If you have spent five minutes on tech news or product launch threads, you have probably seen Cluely. This new AI tool claims to whisper real-time suggestions during sales calls, job interviews, even first dates. Yes, really.

It’s the classic sitcom trope of playing Cyrano: someone hiding in the background feeding you lines. And as every sitcom has taught us, that never ends well for the “hero.”

But in real life, it does not end with a laugh track.

It can end with broken trust, data breaches, and very real career consequences.

So while tools like this might promise better responses, more confidence, and a few extra wins, the real question for our Cluely review is not “does it work?” but “what does it cost you?”

What Is Cluely?

Cluely AI is marketed as “real-time coaching” for everything from job interviews to sales calls. The app monitors your screen and mic, then feeds you suggested responses through a hidden overlay that only you can see. In theory, it’s a quiet confidence boost. In practice? It’s a high-tech cheat sheet pitched as a futuristic cheat code.

And while some salespeople might still dream of being the next Wolf of Wall Street, this feels more like sitcom-level cringe with potentially Black Mirror consequences.

Because beneath the surface, Cluely raises serious ethical, legal, and reputational questions.
Frankly, it looks like a red flag on wheels.

The kicker:

  • Reports suggest that one of Cluely’s founders was expelled from Columbia University following involvement with a similar tool designed to assist with interview cheating.
  • Their launch video appears to depict someone using Cluely to misrepresent their age and fake knowledge about art during a date. If people already worry about old photos on dating apps being misleading, this takes misrepresentation to a whole new level.
  • Cluely’s marketing seems to lean heavily into controversy, using outrage as a form of virality. This is often called ragebait marketing.

Co-founder Chungin “Roy” Lee first built a version called Interview Coder as a Columbia student, and used it to land an Amazon internship.

Columbia later expelled him after discovering he’d used AI to fake his way through technical interviews. Rather than backpedalling, Lee leaned into the scandal, rebranding the product as Cluely and raising $5.3 million to expand what he openly called a “cheating tool for literally everything.”

The viral launch video features Lee on a date, using Cluely’s real-time coaching via smart glasses to fake knowledge about art and lie about his age. He’s exposed when the waiter refuses him wine. It’s a scripted skit, but its message is clear: this AI helps you mislead.

While Lee defended the Cluely ad as satire, critics have called it manipulative, mirroring the unsettling undertones of Adolescence, where red-pill ideology and male entitlement play out with violent consequences. In an era of increasingly urgent conversations around consent, AI, and coercion, the ad’s attempt at humour feels less like satire and more like a shrug at the kinds of manipulative behaviour we should be taking seriously: repackaged as comedy, but uncomfortably familiar.

The backlash was swift: within days, companies were already building “counter” tools, such as Validia’s Truely, designed specifically to combat it.

Public reaction was divided. Some admired the audacity; others saw it all as pure “ragebait” marketing. One Twitter/X user, Cody Blakeney, summed it up: “Imagine making a Black Mirror short as a product ad.” MorningBrew made the same comparison, highlighting the dystopian undertone.

An X user calls out Cluely for its Black Mirror vibes
Source: X

Cluely’s Features: Why Do They Raise Concerns?

Cluely AI offers a set of features designed to help users during high-pressure scenarios, such as interviews, sales calls, and exams. But while the tool is framed as supportive, many of its functions are designed to operate without transparency, which is where the concerns begin.

Cluely features are touted as including:

  • Invisible operation during video calls and screen sharing, intended to go unnoticed
  • Evasion of monitoring tools such as screen recorders and keyloggers, complicating oversight
  • Quick-access keyboard shortcuts, allowing discreet use during live interactions
  • Support for all major programming languages, with a clear focus on technical interview use
  • Real-time overlays that display only to the user, hidden from others on the call
  • Customizable prompts and responses tailored to specific talking points or scenarios
  • Integrations with Zoom and other platforms, embedding the tool into common workflows

These features aren’t just about convenience; they’re effectively built for concealment.

When used in high-trust settings, that’s a problem. Whether it’s an interview panel, a client meeting, or an exam room, real-time hidden assistance turns performance into a bit of a lie.

Is Cluely Legal?

This is likely what most people are most curious about, as Cluely’s promise of real-time AI coaching raises serious compliance questions. While not outright illegal in most jurisdictions, the tool treads into murky waters when it comes to privacy, consent, and misrepresentation laws.

The legal picture for a tool like Cluely is nuanced, varying with the laws and regulations of countries, states, and other jurisdictions. At the time of writing, there is no “hard law” about Cluely specifically; however, based on local governance, here’s how it appears to be viewed globally.

This is not legal advice. The information below is based on public policy trends and regulatory interpretations as of April 2025. Always consult a qualified legal professional for jurisdiction-specific guidance.

Is Cluely Legal in the US?

Unclear. Cluely AI appears to exist in a legal grey area for the United States. There’s no specific federal law banning AI coaching apps, but using it in interviews, exams, or employment settings could violate fraud statutes or company policies. Misrepresenting qualifications with hidden tech could be grounds for termination or academic sanctions.

Is Cluely Legal in the UK?

Unclear. The UK is developing AI regulation (notably the proposed Artificial Intelligence Regulation Bill) focused on transparency and user consent. Cluely’s “invisible” overlay model could run afoul of those aims. While not yet banned, its design doesn’t sit comfortably with expected norms of disclosure and fairness.

Is Cluely Legal in Spain?

Likely not. Spain has moved toward strict rules requiring AI-generated content to be labelled. A tool that feeds unlabelled AI prompts during live interactions may conflict with this direction, especially in job interviews or sales environments where consent and clarity are key.

Is Cluely Legal in Brazil?

Likely not. Brazil’s LGPD (Lei Geral de Proteção de Dados) mirrors GDPR in requiring lawful, informed processing of personal data. Covertly analyzing a user’s environment and content, especially in professional settings, likely challenges LGPD’s consent and transparency rules.

Is Cluely Legal in Japan?

Unclear. Japan’s Act on the Protection of Personal Information (APPI) emphasizes consent and purpose-specific data usage. Cluely’s passive data collection and lack of third-party notification could raise compliance issues, particularly in workplaces or recorded interviews.

Is Cluely Legal in Germany?

Likely not. Germany has strict enforcement of GDPR, particularly around employee monitoring and consent. Tools like Cluely, which process behavioural data in real time without informing others, would likely be considered non-compliant—especially in the workplace.

Is Cluely Legal in France?

Likely not. France’s CNIL (data protection authority) takes a firm stance on consent, purpose limitation, and transparency. Using Cluely during interviews or calls—especially without disclosure—could violate not just GDPR, but national expectations around digital integrity.

Is Cluely Legal in Portugal?

Likely not. Portugal enforces GDPR strictly and has increased scrutiny of AI tools in education and employment. Undisclosed AI-generated support—particularly in recruitment contexts—could breach transparency requirements under national and EU law.

Is Cluely Legal in Russia?

Unclear. Russia has fewer specific restrictions on AI use, but surveillance and data sovereignty laws still apply. If Cluely captures sensitive business or personal content without disclosure, users could face scrutiny under data storage and localization laws—especially if the data is routed through external servers.

Is Cluely Legal in India?

Unclear. India’s new Digital Personal Data Protection Act (DPDPA) reinforces consent-based data handling. While enforcement is still developing, tools like Cluely would likely require clear disclosure and opt-in from all parties involved—something the product’s “invisible” model avoids.

Is Cluely GDPR Compliant?

Likely not. Under GDPR (relevant across the EU), tools that record screen or audio data need clear consent. Cluely provides no public evidence of GDPR compliance, and its silent operation raises red flags around lawful data processing and user rights.

tl;dr Guide To Cluely Legal Status

Here is a quick reference table showing Cluely’s potential legal status in various regions, along with the relevant regulations it runs up against:

| Country/Region | Legal Status | Notes | Relevant Regulation |
| --- | --- | --- | --- |
| USA | Grey area | Interview fraud, employer policy violations | State-by-State AI Laws |
| UK | Likely non-compliant | Hidden use clashes with emerging AI law | AI Regulation Tracker – UK |
| Spain | Likely non-compliant | Mandatory AI transparency, honesty in data use | Spain’s AI Transparency Law |
| Germany | Likely non-compliant | Strong GDPR enforcement, especially at work | AI Laws & Regulations – Germany |
| France | Likely non-compliant | CNIL standards on AI ethics and consent | CNIL Recommendations on AI |
| Portugal | Likely non-compliant | GDPR and transparency in recruitment/education | AI Laws & Regulations – Portugal |
| Brazil | Possibly non-compliant | LGPD consent and data transparency | Brazil’s Digital Policy |
| Japan | Potentially risky | Consent and limited data use under APPI | APPI Compliance in Japan |
| Russia | Risk depends on usage | Data localisation and surveillance laws | AI Regulations in Russia |
| India | Compliance unclear | DPDPA still evolving, but consent critical | DPDPA Overview |

Does Cluely Actually Help?


Let’s be honest: even after hearing about the risks, some people will still think, “But if I don’t get caught…”


So let’s talk outcomes. Because the truth is, even if you don’t get exposed, using Cluely can still work against you—professionally, financially, and socially.


Here’s how:


It’s a Distraction Mid-Call
Trying to talk while secretly scanning an AI overlay is a multitasking nightmare. Instead of listening and responding naturally, your focus is split. That delay? That awkward pause while you scan the suggestion? Everyone notices. You’re not more prepared—you’re half-present. Unless you’re practiced in the art of reading off an autocue, you’re likely to end up sounding like a total robot. 


Fake Expertise Smells Fake
Whether you’re pitching a client, answering a technical question, or trying to connect on a sales call—people know when you’re bluffing. AI might give you a fancy sentence or a factoid, but without depth or follow-up, it falls flat. You won’t sound impressive. You’ll sound coached.

A salesperson, in particular, is totally dependent on knowing their numbers, facts, and product details.

If you can’t evangelize what you’re selling, it’s hard to pass that belief on to potential prospects.


Latency Kills Flow
AI responses—even the fastest ones—still have a lag. You wait. You scan. You say it late. That messes with conversational rhythm and makes you look unprepared. In fast-moving meetings, sales calls, or interviews, timing is everything. Cluely has the potential to absolutely wreck this.


You Seem Less Credible, Not More
The irony? The tool that’s supposed to make you sound confident often has the opposite effect. If you can’t speak naturally or think on your feet, people notice. And whether they know it’s AI or not, they’ll remember you didn’t quite “click.” That can cost you deals, offers, and trust. Not to mention if you’re halfway through a call, interview, talk, whatever, and the tech breaks… well, it’s game over!

Why Real-Time AI Coaching Breaks Trust (And Stalls Growth)


Even if Cluely AI doesn’t get you caught outright during the call, it quietly chips away at something far more important: trust. In sales, interviews, or partnerships, people aren’t just buying what you say. They’re buying how you say it, how you listen, how you connect. Real-time coaching breaks that connection.


Real Sales = Real Relationships
Clients don’t want perfection. They want presence. Authenticity builds rapport. When you rely on an AI overlay to guide your words, the conversation starts to feel hollow. People might not be able to name what’s off, but they’ll feel it. And they’ll walk away. It won’t necessarily happen in the call, but over a day, a week, a month, that lack of trust and belief, across many calls with many prospects, begins to compound.


Shortcuts Erode Long-Term Reputation
Even if you land the deal or ace the interview, what happens when your performance doesn’t match your polish? Sooner or later, you’ll be expected to deliver without the script. If your skills don’t back it up, you’ll be remembered not for being impressive, but for being overhyped. Referring back to tropes: how many shows, books, and films centre on a relationship that starts with a lie? The HUGE turning point for that protagonist is being “found out.” Again, there’s usually a happy ending, but that’s make-believe; this is real life.


You Can’t Grow If You’re Being Fed Lines
Cluely promises confidence, but it robs you of competence. Struggling through hard questions and awkward silences is where the real learning happens. If an AI feeds you the answer every time, you’re not improving. You’re outsourcing your growth. You will only ever be as good as the tool you’re using on the day you’re using it: a vessel to regurgitate what it feeds you, rather than someone who is genuinely good at their job. Imagine if your biggest competitor started using the same tactic, but with more charisma. You’ve lost that edge immediately.


Even If You’re Not Using Cluely
The rise of tools like Cluely is already having a ripple effect. In some sectors, candidates are being scrutinised more closely because of the assumption that someone is faking it. The Pragmatic Engineer recently reported on hiring managers becoming wary of “AI-enhanced” applicants who aced interviews but floundered on the job. That means the bar for trust is rising, even for people doing it the right way.


And If It Comes Out Later
Let’s say Cluely does help you win. Then six months later, someone finds out you used it. That retroactive hit to your integrity could cost you the very thing you earned. Reputation isn’t easy to build, but it’s shockingly easy to lose.

What Are the Alternatives to Cluely?

If what you’re really after is confidence, clarity, and better conversations, there are tools that can help, without crossing ethical lines. One option is tl;dv, a tool that helps you prepare before meetings, record with consent, and review insights after the fact.

The difference? tl;dv doesn’t pretend to whisper in your ear. It helps you listen better, remember more, and improve over time. You still show up as yourself, just with a little backup.

Here’s how ethical AI works:

  • Prepare before the meeting by reviewing past calls and key moments.
  • Record with permission, keeping everything above board.
  • Learn after the fact, using sales coaching features without betraying anyone’s trust in the moment.

Whatever tool you end up using, it’s worth asking a simple question: Would I be okay if someone did this to me?

If a recruiter recorded your call without telling you, or a colleague used AI to fake knowledge on a shared project, how would that feel?

Now flip it. Is that how you want to show up?

Be wary of any tool that sells you confidence through deception. If something feels icky, it probably is. And the fix it’s selling might cost more than you think, especially if it undermines your reputation, your growth, or your relationships.

So yes, get support. Use good tools. Learn well.

Just don’t trick people.

Want to move forward with ethical AI? tl;dv can be tried today, totally FREE OF CHARGE, with no “cheating” required. 

FAQs About Cluely

What is Cluely?

Cluely is an AI-powered tool marketed as “real-time coaching” for job interviews, sales calls, exams, and even dates. It runs invisibly on your screen and feeds you suggested responses during live conversations. While positioned as a productivity enhancer, it functions more like a digital cheat sheet.

Is Cluely GDPR compliant?

Cluely has not made any clear public statement on GDPR compliance. Given that it monitors audio and screen content without notifying other parties, it likely raises issues under GDPR’s requirements for transparency, consent, and lawful data processing. In short, it’s risky, especially if you’re operating in or with anyone based in the EU.

Is it legal to use Cluely?

It depends on where you are and how you use it. In many countries, using a hidden AI tool during interviews, meetings, or exams could violate policies or codes of conduct. Even if not strictly illegal, the ethical and professional consequences can be just as damaging.

Are there ethical alternatives to Cluely?

Yes. Tools like tl;dv help you prepare for meetings, record (with consent), and learn from conversations afterwards. They support growth and clarity without trickery, making them more useful in the long run than tools that rely on deception.

Is Cluely ethical?

Legal in some places? Possibly. Ethical? Much harder to argue. Cluely is designed to operate in secret. That alone puts it at odds with most professional standards, especially in situations where trust, fairness, or transparency are expected.

How much does Cluely cost?

Cluely has a free plan with limited features and a premium plan starting at $20 per month. That may sound affordable, but if it jeopardises your credibility or violates policies, the real cost could be far higher.

Also, for a company that brands itself on cheating, is that a company you trust with your financial details?

Can Cluely be detected?

No tool is undetectable forever. Awkward timing, mismatched tone, or screen recordings can all reveal use. If someone does find out, even later, the trust damage can be significant. The risk is real, even if it feels low. Companies are also actively looking for ways to “out” the tool, so while it may be undetectable for a time, it likely won’t stay that way. There are already counter-tools such as Truely, and videos on YouTube of people exposing the software using third-party apps.

What is Cluely used for?

Cluely is marketed for technical interviews, sales pitches, exam scenarios, and anywhere you might need support. In other words, moments where performance matters most. But it’s also where misrepresentation can do the most harm.

What are the risks of using Cluely?

Cluely can compromise your career, your credibility, and your relationships — even if you’re never formally caught. Legal grey areas, broken trust, and the potential for backlash all make it a high-risk option. Especially in a world increasingly wary of AI fakery, using a tool like this might not just harm your reputation — it could hurt others’ chances too.