When I was a kid at school, we learned how to research. I remember when I was about 14 or 15 years old, one of the UK's broadsheet newspapers ran a competition where they printed 20 questions every day for a month, and the school that submitted the most correct answers by post before the deadline would win a prize.
The questions were really hard. Definitely not "general" knowledge. They demanded a visit to the library and time spent poring over numerous texts to ensure the very nuanced questions were correctly answered. There were traps in the questions that anyone trying to cut corners could fall into. It was proper research.
There is no doubt in my mind that technology has accelerated our access to information, mostly for the good. But maybe the price we pay is that we're all a little less inclined to put in the hard miles of proper research, the kind that, in itself, brought learning?
Then, along comes AI in the form of ChatGPT. I've had a bit of a play with it, and I think I agree with the view that it might be the next major evolution of search. Again, for the most part this might be good. But perhaps not entirely...
I know it is currently limited by the scope of the data it is trained on, but then again, Google didn't have the whole of the WWW indexed at first either, and we still had to use Ask Jeeves sometimes 😂
But I do worry that these AI tools appear to take away the need for that last bit of critical assessment of the information, by presenting a fully formed answer to a prompt. Will we lose the ability to research, evaluate, form an opinion and present content that we really believe in? We might even fail to learn anything from the research, because we're not engaged enough in the production of the end piece.
I've already started to spot content on various platforms that I think the author has generated using ChatGPT. I've even invented a game I call "AI Jeopardy" (after the game show where contestants work out the question from the answer). The rules are simple: find a piece of content you think might have come from an AI, and try to create the prompt that would have produced something like it. It's quite a lot of fun, but the more serious undertone for me is that the poster of that content immediately slips down the credibility stakes in my mind, because I know (or at least suspect) that it wasn't really written by them.
Surely, as useful as these tools are as an accelerator, there's still a place for content creators to edit and apply their own knowledge, opinion, personality and tone of voice to a piece of writing? I hope we don't end up in a world where all copy is just an AI amalgam of other writers' historic content. Where will the new thinking come from?
And don't get me started on AI art, with its soulless dead eyes and mangled hands... art needs to have a heart!