
breadbrix t1_jdbdx1h wrote

Are you willing to bet your job/career on data pipelines created by GPT-4? In a PII/PHI/PCI-compliant environment? Where fines start at $10K per occurrence?

Unless the answer is a resounding "Yes", then no, data engineering is not out the door.
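To make that concrete, here's the kind of guardrail a human would still have to bolt onto any GPT-generated pipeline before signing off on a load. This is a minimal sketch in Python, with made-up table/field names and a crude SSN regex standing in for a real PII scanner:

    import re
    import sys

    # Hypothetical guardrail: refuse to load a batch unless it passes basic
    # integrity and PII checks. Field names here are illustrative only.
    SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

    def validate_batch(rows, required_fields=("patient_id", "visit_date")):
        """Return a list of problems; an empty list means the batch may load."""
        problems = []
        for i, row in enumerate(rows):
            # Required fields must be present and non-empty.
            for field in required_fields:
                if not row.get(field):
                    problems.append(f"row {i}: missing {field}")
            # Free-text fields must not leak SSN-like values into a non-PII zone.
            for field, value in row.items():
                if isinstance(value, str) and SSN_PATTERN.search(value):
                    problems.append(f"row {i}: possible SSN in {field}")
        return problems

    if __name__ == "__main__":
        batch = [
            {"patient_id": "A123", "visit_date": "2023-03-01", "notes": "routine checkup"},
            {"patient_id": "", "visit_date": "2023-03-02", "notes": "SSN 123-45-6789 on file"},
        ]
        issues = validate_batch(batch)
        if issues:
            print("\n".join(issues))
            sys.exit(1)  # block the load so a human reviews the batch

The model can draft the transformation logic, but someone still has to own checks like these, and own the consequences when they miss.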

3

RemarkableGuidance44 t1_jdc2k75 wrote

haha exactly, the guy has never worked with data. Just imagine getting an audit and not knowing whether your data is right or not. It could of messed up big time and cost hundreds of thousands.

2

of_patrol_bot t1_jdc2kt9 wrote

Hello, it looks like you've made a mistake.

It's supposed to be could've, should've, would've (short for could have, would have, should have), never could of, would of, should of.

Or you misspelled something, I ain't checking everything.

Beep boop - yes, I am a bot, don't botcriminate me.

−1

Mental-Egg-2078 OP t1_jdcn0j7 wrote

Fair point, but at what point are these things accepted as providing reasonable assurance (for things like audits)?

I get that the idea of the data not being right is scary, but once governing bodies let these tools in, there's no going back. That said, I agree that in situations where reasonable assurance isn't enough, you don't want a predictive machine making critical choices.
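For what it's worth, "reasonable assurance" at the pipeline level usually comes down to evidence you can hand an auditor. A rough sketch, with illustrative names, of the kind of per-run record that makes that possible, regardless of whether a human or a model wrote the pipeline:

    import hashlib
    import json

    # Hypothetical audit-trail record: for each pipeline run, persist enough
    # evidence (row counts plus a content hash) that an auditor can
    # independently verify the load. Field names are made up.
    def run_evidence(run_id, source_rows, loaded_rows):
        digest = hashlib.sha256(
            json.dumps(loaded_rows, sort_keys=True).encode()
        ).hexdigest()
        return {
            "run_id": run_id,
            "source_count": len(source_rows),
            "loaded_count": len(loaded_rows),
            "counts_match": len(source_rows) == len(loaded_rows),
            "loaded_sha256": digest,
        }

    if __name__ == "__main__":
        src = [{"id": 1}, {"id": 2}]
        dst = [{"id": 1}, {"id": 2}]
        print(json.dumps(run_evidence("run-001", src, dst), indent=2))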

1