Hamsalekha on Artificial Intelligence
The Kannada music director makes some stellar points on the use of AI in creative pursuits, which hold important lessons for business as well.
Today, thanks to user “Dank Venkateshwara” (who I met in real life in San Francisco last month), I came across this video of Kannada music director Hamsalekha talking about AI:
(Watch from 22:00 to 23:30).
This is absolutely fascinating stuff. I asked Gemini (for YouTube content, I prefer Gemini) to paraphrase what he said, in English. If you understand Kannada you should watch the original, but otherwise this Gemini summary is good enough.
This is what Gemini said:
\begin{Gemini}
The segment from 22:00 to 23:30 discusses the role of Artificial Intelligence (AI) in music and how musicians, directors, and lyricists should approach it.
Here's a summary of the key points:
Embrace AI or become outdated: Hamsalekha states that if artists don't work with AI, they will become "out of date" [22:17].
AI as a tool for content creation: AI can generate content based on recorded material, but the artist needs to provide specific requests [22:29].
Human reconstruction is key: The AI-generated content needs to be "reconstructed" by humans, incorporating human feelings, and then fed back to the AI for refinement [22:41].
Improving sound quality: This iterative process of human reconstruction and AI processing can lead to incredible developments in sound quality [23:02].
Importance of human talent: Ultimately, human talent remains crucial, as artists need to know what to ask from the AI [23:17].
\end{Gemini}
Here is one of the most legendary music directors of the Kannada industry (admittedly not the best when it comes to film music), who has been in action for at least 40 years, coming out with a massive endorsement of AI, and claiming that if you don’t embrace it you will be left behind.
This is in sharp contrast to writers and musicians elsewhere who have been crying foul over AI-produced work, saying it threatens their livelihood (I think the recent Hollywood strikes and protests were partly about this). There have also been numerous copyright lawsuits from authors and publishing houses who somehow claim that training AI on their material is a form of “cheating”.
I fully endorse what Hamsalekha has said here, and its implications extend well beyond music-making.
Basically, the “alpha”, the creativity, has to come from the human. You then get the AI to execute on it. The AI will never get it right on the first shot, and it is up to the human to iteratively prompt it towards the output he/she desires.
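To make this concrete, here is a minimal sketch of that human-in-the-loop workflow, in Python. The generate() function is a hypothetical stand-in for whatever model or API you use (it is not a real library call); the point is simply that the human supplies the idea and keeps refining the prompt until the output matches what they had in mind.

```python
# A sketch of the iterative "human provides the alpha, AI executes" loop.
# `generate` is a hypothetical placeholder, not a real API - swap in your
# model call of choice.

def generate(prompt: str) -> str:
    """Stand-in for a call to a generative model."""
    return f"<model output for: {prompt!r}>"


def iterate_with_ai(initial_idea: str, max_rounds: int = 5) -> str:
    prompt = initial_idea          # the creative direction comes from the human
    output = generate(prompt)      # first shot - rarely what you want
    for _ in range(max_rounds):
        feedback = input(f"Output:\n{output}\nRefinement (blank to accept): ")
        if not feedback.strip():   # the human is satisfied, stop iterating
            return output
        # the human "reconstructs" the result into a sharper prompt
        prompt = f"{prompt}\nRevise as follows: {feedback}"
        output = generate(prompt)
    return output
```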
Copilot or not
This interview also gives an excellent framework for when you want a copilot model and when you want an AI that is much more prescriptive (like what we’re building at Babbage Insight). It basically comes down to the creativity of the user.
If the user is a creative person like Hamsalekha, the AI is a tool. The creativity comes from Hamsalekha’s thoughts, and AI is used to execute. He is also opinionated about what his music should sound like, and as he says in the interview, he has learnt to prompt the AI iteratively to get the kind of output he is looking for. In other words, creativity is not enough - you also need to know how to use / prompt AI properly in order to benefit from the copilot experience.
So in the hands of a creative and knowledgeable “pilot”, AI as a copilot can be a winning formula.
However, not everyone is as creative. Not everyone is as opinionated on what the output needs to look like. People just want to get the job done using AI, without expending too much effort.
And in such cases, you need the AI to step in. The AI needs to be more opinionated, and function without needing too many instructions / prompts. Unlike Hamsalekha, the user in these cases neither knows exactly what he is looking for, nor can he prompt precisely enough to get it.
Application to data analysis
When I started writing this, I didn’t intend this to become a work post, but it seems to have turned out that way!
So let’s use the above framework to determine whether copilots are appropriate or not for data analytics use cases.
There are some stellar data scientists who can just “do gymnastics with data”, performing clever analyses and unearthing the kind of insights that most of their peers can’t. Similarly, there are also some stellar business people who always know the right questions to ask to extract insight from the data, even if they may not be that good at analysing the data themselves.
If these are the only personas that are going to use your “analytics AI”, then copilots are ideal.
However, consider the average user of an AI analytics tool. They just need the insights, without having to do too much work. Give them a copilot and they might freeze, not knowing what to ask it. They may not know what to do with the answer it has just given. They may not know how to prompt the AI precisely enough to get what they want.
In this context, and I guess I’m just beating my own drum here, copilots are NOT the right tool. It is here that you need more opinionated AI. We wrote about this on our corporate blog not long ago:
You should check out our latest post there on how chat is not always the right interface for consuming AI, and how in most “winning” use cases AI will quietly work in the background.
Like in Hamsalekha’s (possibly yet to be released) music.
Be like Hamsalekha.
PS: Now I know why people ask for copilots even when they don’t really need one - they want to believe that they are creative and know how to use AI. And they want that illusion of control.