Do you remember that scene in Friends when Joey gets asked, “Let me ask you one question. Do your friends ever have a conversation and you just nod along even though you’re not really sure what they’re talking about?”
That’s exactly how I’ve felt about artificial intelligence (AI).
I know the term, I get the basic concept, but the moment someone starts talking about narrow AI, AI winters, and deep learning, I’m completely out of my depth — nodding along like Joey, hoping no one notices.
So, I decided to do both myself (and maybe you) a favor and answer the question: What is AI, how does it work, and how is it used in practice?
As usual, I’ve got the entire internet at my disposal — but this time, I also enlisted the help of Johan Åberg, our CPO here at lynes. Johan has actually conducted research in computer science at Linköping University, including work within artificial intelligence.
That said — don’t hold Johan accountable for this text (things might get awkward with his former research colleagues).
Want to become an AI expert? Let’s dive in.
What is AI?
After sitting down with Johan, my short answer to what AI is goes something like this:
Artificial intelligence (AI) — or machine intelligence, as it’s also called — is a machine’s ability to mimic human behavior and natural intelligence.
It includes cognitive functions such as learning from experience, solving problems, and planning and executing sequences of actions, as well as generalizing from past outcomes to new situations.
It’s also the name of the field of study that explores how to create computer programs capable of intelligent behavior.
When was AI “invented”?
John McCarthy coined the term artificial intelligence and defined it at the Dartmouth Conference in 1956, widely regarded as AI’s official birth.
That said, Alan Turing was already writing about "intelligent machinery" in the 1940s. During World War II he had helped crack the Enigma code, a breakthrough that aided the Allied war effort.
At Dartmouth, a group of bearded men gathered to announce:
“We propose that a 2-month, 10-man study of artificial intelligence be carried out during the summer of 1956 at Dartmouth College… We think that a significant advance can be made… if a carefully selected group of scientists work on it together for a summer.”
— McCarthy, 1955
And just like that, AI was “invented” — and defined. Simple, right? Well, sort of.
The Dunning-Kruger effect of technology
To understand AI’s evolution, it helps to look at Gartner’s Technology Hype Cycle, which maps out five phases of technological maturity:
- Innovation trigger
- Peak of inflated expectations
- Trough of disillusionment
- Slope of enlightenment
- Plateau of productivity
This curve illustrates how a technology emerges, gets overhyped, then matures into something genuinely useful.
The progress of AI research
A lot has happened since the 1950s. Thankfully, it’s no longer just men with beards pushing the field forward — today, AI research is driven by people of all genders across the world.
A key player is OpenAI, originally founded as a non-profit by Elon Musk, Sam Altman, and others. Their stated mission centers on safe, beneficial AI (basically, making sure the robots don't take over the world).
Among their most famous tools are DALL·E 2 (which creates images from text prompts) and text summarization models (which can read an entire book and give you a short summary).
For example, if you type “dog walking in the desert with an astronaut, in retro style,” DALL·E 2 will generate exactly that image.
Swedish AI research
Sweden is also in the game. The Wallenberg AI, Autonomous Systems and Software Program (WASP) is one of the largest research initiatives in Swedish academic history.
A key reason it’s important? Most AI training data is in English — which means languages like Swedish lack the same depth of datasets, especially when you throw in dialects. (Sorry, Skåne.)
AI in everyday life
AI is already everywhere — even if we don’t always notice it.
You encounter it when you:
- Use self-driving cars (hi, Tesla)
- Search the web
- Live in a “smart home” that adjusts lighting or temperature automatically
- Shop online (it’s not a coincidence that ads for air fryers follow you everywhere)
AI in business telephony and contact centers
So, what about AI in business communication?
To make it work effectively, AI must operate in real time and on the web.
Fast AI (real-time) solutions are becoming the standard — not just accurate, but lightning quick, running on new specialized chips from Intel, AMD, NVIDIA, and Apple.
Web AI is powered by technologies like:
- WebAssembly, which lets heavy computations run directly in your browser
- WebGL and WebGPU, enabling 3D rendering and direct access to your computer’s graphics processor
- TensorFlow.js, an open-source AI framework for browsers
AI features in lynes
Now that the tech foundation is here, we’ve begun implementing (and expanding) AI-powered functionality in lynes.
Here are a few examples already live:
- Face detection in video calls that automatically zooms and frames participants
- Selfie segmentation, which recognizes your silhouette and lets you blur or replace your background
- Noise cancellation, which intelligently filters out background sounds to enhance speech clarity
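To give a feel for what "filtering out background sounds" means in code, here is a deliberately simplified noise-gate sketch in Python. This is not how lynes (or any modern product) actually does it; real noise cancellation uses trained neural networks, while this toy version just mutes audio frames whose energy falls below a fixed threshold.

```python
import math

def rms(frame):
    """Root-mean-square energy of one audio frame (a list of samples)."""
    return math.sqrt(sum(s * s for s in frame) / len(frame))

def noise_gate(frames, threshold=0.05):
    """Keep loud frames (likely speech), silence quiet ones (likely hiss)."""
    return [frame if rms(frame) >= threshold else [0.0] * len(frame)
            for frame in frames]

speech = [0.3, -0.2, 0.4, -0.1]    # loud frame: kept
hiss   = [0.01, -0.02, 0.01, 0.0]  # quiet frame: muted
print(noise_gate([speech, hiss]))
# → [[0.3, -0.2, 0.4, -0.1], [0.0, 0.0, 0.0, 0.0]]
```

The neural-network versions learn what speech "looks like" instead of relying on a volume threshold, which is why they can remove a barking dog even when it is louder than your voice.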
These features make your calls and meetings smoother, smarter, and a lot more professional.
The future of AI in lynes
When I asked Johan what’s next, he mentioned speech-to-text — automatically transcribing and summarizing customer support calls.
Another fascinating possibility is sentiment analysis, where AI identifies whether a customer is happy or frustrated based on tone and language, and then categorizes the call accordingly.
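To make the idea concrete, here is a toy keyword-based sentiment tagger in Python. Real sentiment analysis uses trained language models that weigh tone and context; the word lists below are purely illustrative, invented for this sketch.

```python
# Toy sentiment tagger: counts positive vs. negative keywords.
# Illustrative only; production systems use trained language models.
POSITIVE = {"great", "thanks", "happy", "perfect", "helpful"}
NEGATIVE = {"broken", "angry", "cancel", "frustrated", "terrible"}

def classify_call(transcript):
    """Label a call transcript as happy, frustrated, or neutral."""
    words = [w.strip(".,!?") for w in transcript.lower().split()]
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "happy"
    if score < 0:
        return "frustrated"
    return "neutral"

print(classify_call("Thanks, that was perfect!"))       # → happy
print(classify_call("I'm frustrated, please cancel."))  # → frustrated
```

In a contact center, a label like this could route a frustrated caller to a senior agent or flag the call for follow-up, which is exactly the kind of categorization the paragraph above describes.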
Looking further ahead, we might see general AI capable of autonomously handling customer support in natural language — a self-learning bot that interacts with users intelligently. (Chills, right?)
Summary
AI has been on quite a journey — from bearded scientists at Dartmouth to face detection and voice recognition in apps like lynes.
To wrap it all up in my own “AI-style summary”:
Joey from Friends isn’t the brightest. Filip doesn’t know much about AI. AI has seasons. Filip talks to Johan, who’s done AI research. Alan Turing ends the war, bearded men host a conference, Gartner uses fancy words, WASP is not the band from the 80s, Intel and AMD make chips, and lynes now has AI thanks to Johan — and because more is always better.
Voilà!