Is it you? In the age of generative AI, this will be a defining question. To what extent is my work a reflection of my thought and effort? And to what extent is it the product of a machine or other entity? This is not a new question. A variant of it existed in ancient Greece, where the admonition – Know Thyself – was inscribed on the temple of Apollo at Delphi. Though the maxim predates him, it is closely associated with Socrates, and it asks each person to consider carefully what makes them unique. That was sound advice then, and it remains so now. Indeed, it is the foundation of any successful collaboration, allowing the co-authors of a book or artistic work to decide who does what and how credit is to be shared.
Now, machines can collaborate with us in new and unexpected ways. The intelligence may be different, but the interaction is similar to that with another human, albeit one who has read just about everything on the internet. And like a human, no model is all-knowing. The data used to train these models remain partial and incomplete; much of the content held in the world's many archives has yet to be digitized.
Whether one is partnering with a human or a large language model (LLM), the collaboration choices are similar. The first option is to have humans do all the work. In this case, generative AI is removed from the picture altogether. The second option is to allow the machine to do some of the work while other tasks, including oversight, are handled by humans. And finally, the third option is to outsource the entire task to the machine, letting it do the work with little to no human review.
All three options call for thoughtful consideration of where human thought is key to success and where it is less so. Clearly, there will not be a single correct answer in all cases. The expectation with most high-end luxury goods, for example, is that the work is done entirely by hand. Other services and products do not carry that expectation. Does a client really care whether their attorney drafted a contract by hand when a machine-generated one is practically free? Price sensitivity, in this case, will probably drive the adoption of generative AI.
In other cases, the choice to avoid generative AI will be personal. For example, a student may decide to master a skill before outsourcing it to a machine. Here a desire to learn, to achieve mastery, takes precedence over efficiency. Consider a computer science student who decides to write C++ code by hand to gain a working knowledge of pointers and memory management before using Microsoft Copilot. This approach makes sense because one cannot properly evaluate the work of a “collaborator” until one knows from personal experience what the final product should look like.
Generative AI is clearly going to change the collaborative landscape. For thousands of years, humans have worked together to advance shared goals and projects. Success, then and now, depends on each participant knowing how their expertise and unique abilities contribute to the larger picture. Machine collaborators have now emerged – large language models and AI-enabled tools of all sorts. The underlying questions, however, remain the same. What am I uniquely qualified to do? Who do I want to be? What skills must I master to achieve my goals? And given those goals, what tasks should I let the machine do, and which should I do myself?