Cybersecurity jobs are better paid than you might expect. Given the skyrocketing significance of the data protection industry, that shouldn’t come as much of a surprise.

If you’ve been wondering what cybersecurity experts take home at the end of the day, this article will reveal the numbers involved. After all, knowing what you could earn gives a clearer picture of whether this field would be a worthwhile endeavor for you.

Factors Influencing Cybersecurity Jobs Salary

First, let’s look at what makes one cybersecurity job pay more than another. The distinction goes beyond how good an individual is with computers or their knowledge of firewalls, proxies, and the like. Location is also a big factor. Living in or around tech hubs usually means higher pay, but also much higher living costs.

There is also the factor of experience to consider. The more years in the field, the more your salary will grow.

Your education level and qualifications matter, too. However, whatever your skill or education level, demand for cybersecurity talent is high. Businesses everywhere are looking for cyber defenders, so skilled professionals are in a position to negotiate high pay for their services.

Cybersecurity Salary Overview: Entry-Level vs. Experienced Workers

So, how much does cybersecurity pay? It’s worth noting that you might not make a huge sum right away. But the growth potential within the field is impressive. Entry-level salaries are decent, but as you climb the ranks, your salary will increase accordingly.

It will also depend on the type of job you do. Penetration testers, security analysts, and CISOs all earn different salaries at different levels. The main thing to remember is that pay often reflects your skills, so honing them is the best way to help your future earnings soar.

In Europe, starting cybersecurity salaries are around €37,000 and can grow to significant sums as you climb the ranks, especially in roles like CISO, where salaries can reach upwards of €180,000.

How Much Do Cybersecurity Professionals Make Globally?

If you’re thinking of taking your cybersecurity talents on a global tour, know that the average cybersecurity salary varies widely around the world. The figures are influenced by local market demand, economic conditions, and living expenses. However, here is a thought worth considering: remote work is changing how pay in this field is distributed globally. Now, you can live in one country and work for a company in another, potentially earning more than the local rate. That makes this an exciting time to explore international opportunities and make the most of the global demand for your skills.

But how much do cybersecurity professionals make in Europe? The salaries are excellent, but can vary depending on where you live. For instance, if you’re eyeing a spot as an entry-level cybersecurity analyst in Germany, you might be looking at an average of €52,539 ($57,420) in 2024. The numbers naturally vary based on role, experience, and location.

Entry-level cybersecurity analysts in the U.S. have starting salaries of around $68,202, rising to a median of $112,000. Meanwhile, a cybersecurity engineer in Japan can expect an average salary of around ¥6,963,427 (approximately USD $55,300) in 2024.

Maximizing Your Cybersecurity Salary Potential

While the earnings sound appealing, you need the skills and knowledge to harness your earning potential. Climbing the ladder takes strategic choices, and more than anything it takes continuous education. Since technology moves at breakneck speed and cyber threats emerge even faster, staying ahead of the game means being a lifelong learner. Put simply, your education doesn’t end once you leave college or complete a course. Earning extra certifications shows prospective employers that you aren’t being left behind.

Don’t neglect the power of networking, either. It goes beyond an online presence or a LinkedIn account. Engage with the cybersecurity community, attend industry meetups, present your own findings and projects, and lend a hand with open-source projects. Get to know people and you’ll gain opportunities you wouldn’t find in a job ad.

If you’re looking for a place to build those skills and connections, OPIT is here to help. OPIT’s career-aligned online programs can catapult you into higher-earning roles.

OPIT’s Master’s in Cybersecurity

OPIT’s master’s program in Cybersecurity is one of the most efficient ways to gain the skills and knowledge that can propel you into the upper echelons of cybersecurity. The program is more than a traditional academic education in computer science and cybersecurity. It’s a challenging undertaking but rewards you with knowledge that you can apply in real-life circumstances right away, through practical sessions and workshops.

Over the course of this program, you’ll tackle digital forensics, encryption, firewalls, and security systems, as well as the strategic thinking behind secure network design. After all, cybersecurity thrives on critical thinking in stress-intensive circumstances and on being flexible and creative enough to come up with solutions on the spot. You’ll be well-equipped for these trials by learning from the best in the industry, people who’ve been at the forefront of the cybersecurity debate for years.

High Risk, High Reward

The salaries people earn within the cybersecurity sphere reflect the major demand in the field and the skills needed to do the job effectively. If you play your cards right, you might be protecting the systems and IT infrastructure of major businesses, nonprofits, or governmental organizations. To get to that point, though, you must learn, and never stop learning. Just as importantly, never underestimate the power of networking and maintaining good relationships.

Programs like OPIT’s master’s degree in cybersecurity are among the best ways to hone these skills from anywhere in the world, learning from the best in the industry at your own pace. Give it a try and see how much of a difference it can make.

Related posts

Wired: Think Twice Before Creating That ChatGPT Action Figure
OPIT - Open Institute of Technology
May 12, 2025 6 min read

Source:

  • Wired, published on May 01st, 2025

People are using ChatGPT’s new image generator to take part in viral social media trends. But using it also puts your privacy at risk—unless you take a few simple steps to protect yourself.

By Kate O’Flaherty

At the start of April, an influx of action figures started appearing on social media sites including LinkedIn and X. Each figure depicted the person who had created it with uncanny accuracy, complete with personalized accessories such as reusable coffee cups, yoga mats, and headphones.

All this is possible because of OpenAI’s new GPT-4o-powered image generator, which supercharges ChatGPT’s ability to edit pictures, render text, and more. OpenAI’s ChatGPT image generator can also create pictures in the style of Japanese animated film company Studio Ghibli—a trend that quickly went viral, too.

The images are fun and easy to make—all you need is a free ChatGPT account and a photo. Yet to create an action figure or Studio Ghibli-style image, you also need to hand over a lot of data to OpenAI, which could be used to train its models.

Hidden Data

The data you are giving away when you use an AI image editor is often hidden. Every time you upload an image to ChatGPT, you’re potentially handing over “an entire bundle of metadata,” says Tom Vazdar, area chair for cybersecurity at Open Institute of Technology. “That includes the EXIF data attached to the image file, such as the time the photo was taken and the GPS coordinates of where it was shot.”

OpenAI also collects data about the device you’re using to access the platform. That means your device type, operating system, browser version, and unique identifiers, says Vazdar. “And because platforms like ChatGPT operate conversationally, there’s also behavioral data, such as what you typed, what kind of images you asked for, how you interacted with the interface and the frequency of those actions.”
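The EXIF metadata Vazdar describes lives in a well-defined place: the APP1 segment of a JPEG file. One simple protective step is to strip that segment before uploading a photo anywhere. The sketch below is a minimal standard-library Python illustration, assuming a baseline JPEG; a production tool should use a full image library rather than this hand-rolled parser.

```python
def strip_exif(jpeg_bytes: bytes) -> bytes:
    """Remove EXIF (APP1) segments from JPEG data, keeping all other segments."""
    assert jpeg_bytes[:2] == b"\xff\xd8", "not a JPEG file"
    out = bytearray(b"\xff\xd8")  # keep the Start-of-Image marker
    i = 2
    while i < len(jpeg_bytes):
        marker = jpeg_bytes[i + 1]
        if marker == 0xD9:  # End of Image: copy and stop
            out += jpeg_bytes[i:i + 2]
            break
        if marker == 0xDA:  # Start of Scan: entropy-coded data follows, copy verbatim
            out += jpeg_bytes[i:]
            break
        # Every other segment carries a 2-byte big-endian length (includes itself)
        length = int.from_bytes(jpeg_bytes[i + 2:i + 4], "big")
        segment = jpeg_bytes[i:i + 2 + length]
        # APP1 (0xE1) segments whose payload starts with "Exif" hold the metadata
        if not (marker == 0xE1 and segment[4:8] == b"Exif"):
            out += segment
        i += 2 + length
    return bytes(out)
```

The timestamps, GPS coordinates, and camera details all travel inside that one APP1 segment, which is why dropping it removes them in one pass while leaving the visible image untouched.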

It’s not just your face. If you upload a high-resolution photo, you’re giving OpenAI whatever else is in the image, too—the background, other people, things in your room and anything readable such as documents or badges, says Camden Woollven, group head of AI product marketing at risk management firm GRC International Group.

This type of voluntarily provided, consent-backed data is “a gold mine for training generative models,” especially multimodal ones that rely on visual inputs, says Vazdar.

OpenAI denies it is orchestrating viral photo trends as a ploy to collect user data, yet the firm certainly gains an advantage from it. OpenAI doesn’t need to scrape the web for your face if you’re happily uploading it yourself, Vazdar points out. “This trend, whether by design or a convenient opportunity, is providing the company with massive volumes of fresh, high-quality facial data from diverse age groups, ethnicities, and geographies.”

OpenAI says it does not actively seek out personal information to train models—and it doesn’t use public data on the internet to build profiles about people to advertise to them or sell their data, an OpenAI spokesperson tells WIRED. However, under OpenAI’s current privacy policy, images submitted through ChatGPT can be retained and used to improve its models.

Any data, prompts, or requests you share helps teach the algorithm—and personalized information helps fine tune it further, says Jake Moore, global cybersecurity adviser at security outfit ESET, who created his own action figure to demonstrate the privacy risks of the trend on LinkedIn.

Uncanny Likeness

In some markets, your photos are protected by regulation. In the UK and EU, data-protection regulations, including the GDPR, offer strong safeguards, including the right to access or delete your data. At the same time, use of biometric data requires explicit consent.

However, photographs become biometric data only when processed through a specific technical means allowing the unique identification of a specific individual, says Melissa Hall, senior associate at law firm MFMac. Processing an image to create a cartoon version of the subject in the original photograph is “unlikely to meet this definition,” she says.

Meanwhile, in the US, privacy protections vary. “California and Illinois are leading with stronger data protection laws, but there is no standard position across all US states,” says Annalisa Checchi, a partner at IP law firm Ionic Legal. And OpenAI’s privacy policy doesn’t contain an explicit carve-out for likeness or biometric data, which “creates a grey area for stylized facial uploads,” Checchi says.

The risks include your image or likeness being retained, potentially used to train future models, or combined with other data for profiling, says Checchi. “While these platforms often prioritize safety, the long-term use of your likeness is still poorly understood—and hard to retract once uploaded.”

OpenAI says its users’ privacy and security is a top priority. The firm wants its AI models to learn about the world, not private individuals, and it actively minimizes the collection of personal information, an OpenAI spokesperson tells WIRED.

Meanwhile, users have control over how their data is used, with self-service tools to access, export, or delete personal information. You can also opt out of having content used to improve models, according to OpenAI.

ChatGPT Free, Plus, and Pro users can control whether they contribute to future model improvements in their data controls settings. OpenAI does not train on ChatGPT Team, Enterprise, and Edu customer data by default, according to the company.

LADBible and Yahoo News: Viral AI trend could present huge privacy concerns, says expert
OPIT - Open Institute of Technology
May 12, 2025 4 min read

Source:


You’ve probably seen them all over Instagram

By James Moorhouse

Experts have warned against participating in a viral social media trend which sees people use ChatGPT to create an action figure version of themselves.

If you’ve spent any time whatsoever doomscrolling on Instagram or TikTok or, dare I say it, LinkedIn recently, you’ll be all too aware of the viral trend.

Obviously, there’s nothing more entertaining and frivolous than seeing AI-generated versions of your co-workers and their cute little laptops and piña coladas, but it turns out that it might not be the best idea to take part.

There may well be some benefits to artificial intelligence but often it can produce some pretty disturbing results. Earlier this year, a lad from Norway sued ChatGPT after it falsely claimed he had been convicted of killing two of his kids.

Unfortunately, if you don’t like AI, then you’re going to have to accept that it’s going to become a regular part of our lives. You only need to look at WhatsApp or Facebook Messenger to realise that. But it’s always worth saying please and thank you to ChatGPT just in case society does collapse and the AI robots take over, in the hope that they treat you mercifully. Although it might cost them a little more electricity.

Anyway, in case you’re thinking of getting involved in this latest AI trend and sharing your face and your favourite hobbies with a high tech robot, maybe don’t. You don’t want to end up starring in your own Netflix series, à la Black Mirror.

Tom Vazdar, area chair for cybersecurity at Open Institute of Technology, spoke with Wired about some of the dangers of sharing personal details about yourself with AI.

Every time you upload an image to ChatGPT, you’re potentially handing over ‘an entire bundle of metadata’, he revealed.

Vazdar added: “That includes the EXIF data attached to the image file, such as the time the photo was taken and the GPS coordinates of where it was shot.

“Because platforms like ChatGPT operate conversationally, there’s also behavioural data, such as what you typed, what kind of images you asked for, how you interacted with the interface and the frequency of those actions.”

Essentially, if you upload a photo of your face, you’re not just giving AI access to your face, but also to whatever is in the background, such as your location or other people who might feature.

Vazdar concluded: “This trend, whether by design or a convenient opportunity, is providing the company with massive volumes of fresh, high-quality facial data from diverse age groups, ethnicities, and geographies.”

While we’re at it, maybe stop using ChatGPT for your university essays and general basic questions you can find the answer to on Google as well. The last thing you need is AI knowing you don’t know how to do something basic if it does take over the world.
