
The Magic and Reality of AI: What can Generative AI actually do?


Many things pass for AI, and it's sometimes hard to put them all under a common denominator, but Christine Spang gives it a shot with a pair of simple equations in her Shift Miami talk, "Chatbots are cool, but what else can generative AI do?"

Leverage = having a higher impact with a smaller input

AI = using computers to generate leverage

Computing itself, argues Christine, is about giving more leverage to individuals or groups, and the rise of LLMs has driven AI magic into new sets of use cases.

We used to carry water to the village; now we have tap water. We invented language, and then systems for storing information. We keep making better ways to use the data and information we have today – AI is our latest attempt at that.

Chatbots, knowledge bases, coding assistants

Today, the most notable use cases are user-facing chatbots, knowledge bases, and coding assistants (or, as often happens, some combination of the three).

Chatbots have come a long way from their early incarnations and can now boast *great UI and conversational intelligence*.

Christine argues that coding assistants (or copilots) supercharge our coding powers, boosting productivity and efficiency across the dev cycle.

AI-powered knowledge bases give us access to the right information when we need it, not when a customer service agent is available or can schedule a call.

As a good example of that (and of the benefits AI brings), Christine mentioned her company's own chatbot, Nylas Assist, a chat interface for their docs. Launched in August 2023, it has reduced the number of raised tickets by 25%, even though the user base grew by 30% – that's precisely the leverage she's talking about.

Language is messy; data should not be

All three use cases are pretty useful – but is this the peak of what AI can do? We've probably all guessed that it's not. At the moment, Christine notes, *we have generalized datasets, which only allow us to get generalized actions*.

Human language, on the other hand, is messy, full of nuance, and context-dependent. Communication is the bottleneck through which most relevant information and context passes, she states, and it happens over many different channels, like messaging apps, email, voice, and social media, as well as asynchronously.

Current LLMs are trained on virtually all the scrapable content available online. That's a lot of text, but not necessarily a lot of the right context. LLMs trained on social media, Christine points out, don't necessarily have the right context for business.

Customization is the next step forward

Communication data is a haystack, she stresses; it's not valuable as an unstructured pile.

Take, for example, email, which is Nylas's bread-and-butter data. It is particularly messy, mixing plain text, images, links, and formatting. Passing that data to a model the right way is a challenge: everything needs to be pre-processed in a specific way, extracting not just the information but also its order. The data needs to be structured to have value, and it needs the right context to move beyond generalized outputs.
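To make that concrete, here's a minimal sketch of what such pre-processing could look like: a toy extractor that walks HTML email markup and keeps text, links, and images in their original order. The `EmailChunk` structure and class names are illustrative assumptions, not Nylas's actual pipeline.

```python
from dataclasses import dataclass
from html.parser import HTMLParser


@dataclass
class EmailChunk:
    """One ordered piece of an email body: text, a link, or an image."""
    kind: str      # "text", "link", or "image"
    content: str
    position: int  # order is preserved, since meaning depends on sequence


class EmailExtractor(HTMLParser):
    """Walks HTML email markup and emits chunks in document order."""

    def __init__(self) -> None:
        super().__init__()
        self.chunks: list[EmailChunk] = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a" and "href" in attrs:
            self._add("link", attrs["href"])
        elif tag == "img" and "src" in attrs:
            self._add("image", attrs["src"])

    def handle_data(self, data):
        text = data.strip()
        if text:
            self._add("text", text)

    def _add(self, kind: str, content: str) -> None:
        self.chunks.append(EmailChunk(kind, content, len(self.chunks)))


extractor = EmailExtractor()
extractor.feed('<p>See the <a href="https://example.com/docs">docs</a> page.</p>')
for chunk in extractor.chunks:
    print(chunk)
```

The point of the ordered chunks is exactly what the talk describes: the model gets not just the information, but the order it appeared in.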

Context is exactly how Christine sees generative AI evolving and generating even more leverage: when we access the context of a dataset rather than just the dataset itself, we can customize models and get customized actions instead of generalized ones.
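One common way to put that idea into practice is to inject retrieved context from your own data into the prompt before calling a model. The sketch below is a hypothetical illustration: `retrieve_context` uses naive keyword overlap as a stand-in for a real embedding-based retriever, and no specific LLM API is assumed.

```python
def retrieve_context(question: str, documents: list[str], top_k: int = 3) -> list[str]:
    """Naive keyword-overlap retrieval; a real system would use embeddings
    and a vector store, but the shape of the idea is the same."""
    terms = set(question.lower().split())
    scored = sorted(documents, key=lambda doc: -len(terms & set(doc.lower().split())))
    return scored[:top_k]


def build_prompt(question: str, documents: list[str]) -> str:
    """Prepend our own structured context so a general model gives a customized answer."""
    snippets = retrieve_context(question, documents)
    context = "\n".join(f"- {s}" for s in snippets)
    return (
        "Answer using only the context below.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}"
    )


# Toy "dataset" standing in for docs or communication data.
docs = [
    "Webhooks are retried with exponential backoff on delivery failure.",
    "The calendar API supports free/busy lookups across providers.",
    "API rate limits reset every 60 seconds.",
]
print(build_prompt("How are failed webhooks retried?", docs))
```

The same generalized model, handed the right slice of context, produces a customized action instead of a generalized one – which is the leverage the talk is after.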
