Social Engineering

Social engineering is the art of manipulating people into giving away private information or access to restricted places. Instead of breaking in or hacking systems directly, attackers use our natural trust against us. Today, these tricks have become more sophisticated with modern technologies such as deepfake videos, realistic phishing emails, and AI-driven chatbots that can hold convincing conversations. These tools make it even easier for attackers to deceive people by appearing trustworthy or familiar.

Social Engineering Methodologies

  • Impersonation: The practice of pretending to be another person with the goal of obtaining information or access to a person, company, or computer system.
  • Deepfake Technology: Attackers use AI (Artificial Intelligence) and machine learning to create convincing audio and video clips of trusted individuals, such as CEOs or public figures, to manipulate victims into performing actions or revealing sensitive information.
  • Vishing (voice or VoIP phishing): A fraud tactic in which individuals are tricked into revealing critical financial or personal information to unauthorized entities. Unlike phishing, the fraudster makes direct contact over the phone.
  • Phishing: Fraudulent emails or messages that appear to come from a legitimate source and are designed to trick recipients into revealing sensitive information or opening malicious links or attachments.
  • QR Code Phishing: With the rise of QR code usage for menus, payments, and information sharing, attackers embed malicious URLs in QR codes. Unsuspecting users scan these codes, leading them to phishing sites or malware downloads.
  • AI-Powered Chatbots: These are used on fake websites mimicking legitimate businesses. When users seek assistance via chat, these AI bots can manipulate conversations to extract personal details or financial information.
  • Emergency Scams: A con artist poses as a friend or family member and requests money to help them out of a difficult situation. Visit the Canadian Anti-Fraud Centre for more information.

FAQs

It is much easier to fool someone into revealing their password than it is to guess or crack it. Social engineering scammers know how to exploit your natural inclination to trust. The weakest link in the security chain is the human who accepts a person or scenario at face value. Since social engineering involves a human element, preventing these attacks can be tricky.

Many people at the university have access to things like:

  • Sensitive personal information.
  • Financial information.
  • Bank accounts.
  • Restricted areas of the university.
  • The university after hours.

These things can be used by criminals for financial gain. If you are entrusted with access to anything at the university, it is critical that you ensure that anyone you deal with is in fact who they claim to be and that they have a legitimate reason for access. Recently, scammers have been making cold calls to some universities’ staff, claiming to sell equipment. After the initial call, they follow up via email and try to convince the staff to review an attached equipment list. This attachment contains ransomware. Their goal is to build rapport so that you will open the attachment.

It doesn’t matter how many locks you have on your door if you simply open the door.

  • Do not give away any of your sensitive information to a stranger. Make sure that the person you are dealing with is who they claim to be.
  • Avoid social engineering schemes by only giving information to people who genuinely need it.
  • Limit the personal information you share online. Social engineers often use publicly available information to make their ploys more convincing.
  • Learn about common social engineering tactics and share this knowledge with friends, family, and colleagues to make them aware of these risks.

If you have any questions or concerns regarding social engineering, please contact the Help Desk at x4357 or email us at itsecurity@brocku.ca.

In February 2024, a finance employee based in Hong Kong and working for a multinational firm participated in what appeared to be a routine video conference call with individuals he believed to be the company’s CFO and other staff members. However, the figures in the video were actually deepfakes: sophisticated AI-generated replicas designed to deceive. Believing he was interacting with legitimate company executives, the finance worker was tricked into transferring $25 million to accounts controlled by the fraudsters.

This incident underscores the dangerous potential of deepfake technology when used maliciously to impersonate trusted individuals in critical financial operations.