Cybercrime, AI, and the Equipment Dealership

Guest writer Kevin Landers returns this week to highlight the relevance of cybersecurity in “Cybercrime, AI, and the Equipment Dealership.”

It sounds like the beginning of a bad film, but the reality is that the world of digital crime keeps developing as technology evolves, and the latest evolution is AI. Just as AI gives us tools to improve our performance and capabilities in business, criminals are increasingly using it to target equipment dealerships through sophisticated fraud tactics. Unfortunately, they are leveraging AI's capabilities for social engineering, deepfakes, and synthetic identity creation.

Social engineering has always been difficult to protect against, but AI is taking it to new heights. Cybercriminals use AI to automate and personalize phishing campaigns. Phishing is a type of cyberattack that involves tricking people into sharing sensitive information through fraudulent emails, text messages, phone calls, or websites. 

Phishing is the most common form of cybercrime, with an estimated 3.4 billion spam emails sent daily. Reportedly, over 10% of employees worldwide have clicked on malicious links, and over 60% of those who clicked went on to submit a password on a malicious website. Dealerships need to have this on their radar, as employees at small organizations are more likely to click on malicious links.

AI is changing the face of phishing attacks.

AI-driven tactics enable attackers to craft highly believable emails and messages that mimic real business correspondence, including the language, style, and branding typical of a dealership’s internal or external communications. 

Without proper training and awareness, an employee's desire to be helpful, do the right thing, or simply do what the boss says becomes an opening: criminals trick employees into clicking malicious links, sharing sensitive information, or transferring funds under the guise of routine transactions.
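One simple, automated check that can back up that training is screening sender domains for lookalikes of trusted partners. The sketch below is purely illustrative: the trusted-domain list and the fake domain are hypothetical placeholders, and a real mail-filtering product would use far richer signals than edit distance.

```python
# Illustrative sketch: flag sender domains that are suspiciously close to
# trusted ones (e.g. an "l" swapped for a "1"). The domains listed here are
# placeholders, not real dealership infrastructure.
TRUSTED_DOMAINS = {"cdkglobal.com", "mydealership.com"}

def edit_distance(a: str, b: str) -> int:
    # Classic dynamic-programming Levenshtein distance.
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,          # deletion
                           cur[j - 1] + 1,       # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]

def classify_sender(domain: str) -> str:
    d = domain.lower()
    if d in TRUSTED_DOMAINS:
        return "trusted"
    # Within two character edits of a trusted name: likely impersonation.
    if any(edit_distance(d, t) <= 2 for t in TRUSTED_DOMAINS):
        return "suspicious lookalike"
    return "unknown"

print(classify_sender("cdkg1obal.com"))  # → suspicious lookalike
```

A check like this catches only the crudest impersonation, which is exactly why the article pairs technology with employee verification habits.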

Voice cloning technology is particularly alarming for equipment dealerships. Attackers use AI to create convincing voice calls that imitate senior executives or trusted partners. These calls often instruct employees to execute financial transactions or disclose sensitive data, leveraging the urgency and familiarity associated with an executive’s voice.

AI allows criminals to scale these attacks rapidly. Automated AI-powered scripts can launch large-scale attempts to breach systems, such as through credential stuffing, where stolen usernames and passwords are systematically tested across multiple platforms. This capability makes it easier for attackers to find vulnerabilities in dealership systems and gain unauthorized access.
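The credential-stuffing pattern described above has a recognizable signature that dealership IT teams can watch for: one source address attempting many different usernames in a short window. This is a minimal sketch of that idea, with an illustrative threshold; production systems tune such rules and combine them with many other signals.

```python
from collections import defaultdict

# Minimal sketch: flag credential-stuffing patterns in a failed-login log.
# One IP cycling through many *distinct* usernames is the classic signature.
# The threshold below is illustrative, not a tuned value.
STUFFING_THRESHOLD = 5  # distinct usernames per IP before flagging

def flag_credential_stuffing(failed_logins):
    """failed_logins: iterable of (source_ip, username) tuples."""
    usernames_by_ip = defaultdict(set)
    for ip, user in failed_logins:
        usernames_by_ip[ip].add(user)
    return [ip for ip, users in usernames_by_ip.items()
            if len(users) >= STUFFING_THRESHOLD]

log = [("203.0.113.7", f"user{i}") for i in range(8)]  # one IP, 8 accounts
log += [("198.51.100.2", "alice")] * 2                 # a normal retry
print(flag_credential_stuffing(log))  # → ['203.0.113.7']
```

Note the contrast with a legitimate user retrying a forgotten password: many attempts against one account from one address does not trip this rule.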

For example, when the CDK data breach and outage took place earlier this year, dealerships suddenly began receiving calls and emails from "CDK support" offering to connect remotely and get their CDK systems back up and running. In reality, these were malicious actors posing as CDK, attempting to trick dealerships into granting access to their systems. They took advantage of the situation in seconds, not days.

And don't underestimate the power of a chatbot. AI chatbots can engage in real-time interactions with employees, adapting their responses to seem more authentic and trustworthy, thereby increasing the success rate of these scams. It can feel like you are chatting with a real person or discussing plans with your boss. And with many businesses using tools like WhatsApp to discuss business outside of email or the company network, it is easy to fall for these tactics when your attention is divided. Criminals are taking advantage of this.

How can dealerships defend themselves from AI phishing attacks?

  • Employee Training and Awareness: Regular cybersecurity training focused on AI-driven threats is crucial. Employees must learn to identify suspicious communications and verify requests through multiple channels. Making training fun, informative, and embedded into the workflow makes it more effective. IT teams and support services must also ensure training fits dealership scenarios: training on spotting a fake online date might be useful in your personal life, but it probably won't help you decide whether to click a link from someone who appears to be the service manager.
  • Multi-Factor Authentication (MFA): Implementing MFA can significantly reduce the effectiveness of these attacks by adding layers of verification beyond simple passwords or voice recognition. We know this can be annoying, but a good IT solution can make MFA feel seamless while still providing strong protection.
  • AI-Driven Defense Solutions: We have to use AI to fight AI. Systems that monitor for anomalies in behavior, voice patterns, and transaction data can detect and respond to fraudulent activity in real time.
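To make the "anomalies in transaction data" idea concrete, here is a deliberately simple sketch: flag a payment amount that sits far outside an account's historical pattern. The figures are invented for illustration, and real AI-driven tools model many more signals than a single z-score.

```python
import statistics

# Toy anomaly detector: flag transaction amounts more than z_cutoff
# standard deviations from the historical mean. Real fraud systems use
# far richer models; this only shows the principle.
def flag_outliers(history, new_amounts, z_cutoff=3.0):
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    return [amt for amt in new_amounts
            if abs(amt - mean) / stdev > z_cutoff]

history = [1200, 1350, 980, 1100, 1275, 1050]  # typical invoice amounts
print(flag_outliers(history, [1225, 48000]))   # → [48000]
```

A routine $1,225 invoice passes quietly, while a sudden $48,000 wire request gets held for human verification, which is exactly the pairing of automation and vigilance the article recommends.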

As AI fraud evolves, equipment dealerships must stay proactive, combining advanced technology with a culture of vigilance and verification to safeguard against these sophisticated threats.