Knowing how to code is certainly important to my job. But even more important is knowing how to help others write good code, unblocking them when they have a problem, helping define how to solve the larger problem rather than just efficiently sorting a binary tree, being able to prioritize and break down big problems into small ones, and communicating all of the above to coworkers both technical and non-technical. None of those are things that an LLM or ML model can actually solve.
I'm not worried for my job. I'm worried that information spaces (articles, images, video) both artistic and factual, are being filled with garbage. Would you trust the code written by ChatGPT to drive a car, or run a radiation therapy machine? How about for instructions and advice on how to make a cake? We already know they completely invent facts about people.
"AI" certainly has its uses, and it can find some pretty amazing associations from its data sources. But the biggest disruption they provide is making it seem like expertise and experience can be replaced or side-stepped, when it is actually both stolen and incompetent when examined.