By Lokesh Vij

Lokesh Vij is a professor in the BSc in Modern Computer Science and the MSc in Applied Data Science & AI programs at the Open Institute of Technology. With over 20 years of experience in cloud computing infrastructure, cybersecurity, and cloud development, Professor Vij is an expert in all things related to data and modern computer science.

In today’s rapidly evolving technological landscape, the fields of blockchain and cloud computing are transforming industries, from finance to healthcare, and creating new opportunities for innovation. Integrating these technologies into education is not merely a trend but a necessity to equip students with the skills they need to thrive in the future workforce. Though both technologies are independently powerful, their potential for innovation and disruption is amplified when combined. This article explores the pressing questions surrounding the inclusion of blockchain and cloud computing in education, providing a comprehensive overview of their significance, benefits, and challenges.

The Technological Edge and Future Outlook

Cloud computing has revolutionized how businesses and individuals access and manage data and applications. Benefits such as scalability, cost efficiency (including the elimination of capital expenditure, or CapEx), and rapid innovation and experimentation enable businesses to develop and deploy new applications and services quickly, free of the constraints of traditional on-premises infrastructure. Managed services, in which the cloud provider handles the operating system, runtime, and middleware, let businesses focus on development and innovation. According to Statista, the cloud computing market is projected to grow from roughly €110 billion in 2024 to €250 billion or more by 2028, a substantial Compound Annual Growth Rate (CAGR) of 22.78%. The widespread adoption of cloud computing by businesses of all sizes, coupled with the increasing demand for cloud-based services and applications, fuels the need for cloud computing professionals.
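As a quick, illustrative sanity check on those figures, the growth rate follows from the standard formula CAGR = (end / start)^(1 / years) - 1. The short sketch below uses only the €110 billion and €250 billion endpoints quoted above and confirms that the implied rate for 2024 to 2028 is roughly 22.78%:

    # Illustrative check of the implied compound annual growth rate (CAGR),
    # using the Statista endpoints quoted above (figures in billions of euros).
    def cagr(start_value: float, end_value: float, years: int) -> float:
        """Compound annual growth rate over the given number of years."""
        return (end_value / start_value) ** (1 / years) - 1

    print(f"Implied CAGR: {cagr(110, 250, 4):.2%}")  # 2024 -> 2028, prints ~22.78%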

Blockchain, a distributed ledger technology, provides a secure, transparent, and tamper-proof way to record transactions, making it highly resistant to hacking and fraud. In 2021, European blockchain startups raised $1.5 billion in funding, indicating strong interest and growth potential. Reports suggest the European blockchain market could reach $39 billion by 2026, with a significant CAGR of over 47%. This growth is fueled by increasing adoption in sectors such as finance, supply chain, and healthcare.

Addressing the Skills Gap

Reports from the World Economic Forum indicate that 85 million jobs may be displaced by a shift in the division of labor between humans and machines by 2025. However, 97 million new roles may emerge that are more adapted to the new division of labor between humans, machines, and algorithms, many of which will require proficiency in cloud computing and blockchain.

Furthermore, the World Economic Forum predicts that by 2027, 10% of the global GDP will be tokenized and stored on the blockchain. This massive shift means a surge in demand for blockchain professionals across various industries. Consider the implications of 10% of the global GDP being on the blockchain: it translates to a massive need for people who can build, secure, and manage these systems. We’re talking about potentially millions of jobs worldwide.

The European Blockchain Services Infrastructure (EBSI), an EU initiative, aims to deploy cross-border blockchain services across Europe, focusing on areas like digital identity, trusted data sharing, and diploma management. The EU's Markets in Crypto-Assets (MiCA) regulation, expected to be fully implemented by 2025, will provide a clear legal framework for crypto-assets, fostering innovation and investment in the blockchain space. The projected growth and supportive regulatory environment point to a rising demand for blockchain professionals in Europe. Developing skills related to EBSI and its applications could be highly advantageous, given its potential impact on public sector blockchain adoption, and understanding MiCA will be crucial for blockchain roles related to crypto-assets and decentralized finance (DeFi).

Furthermore, European businesses are rapidly adopting digital technologies, with cloud computing as a core component of this transformation. The General Data Protection Regulation (GDPR) and other data protection laws push businesses to adopt secure and compliant cloud solutions, and many European countries invest heavily in cloud infrastructure and promote cloud adoption across various sectors. Artificial intelligence and machine learning will be deeply integrated into cloud platforms, enabling smarter automation, advanced analytics, and more efficient operations. Serverless computing allows developers to focus on building applications without managing servers, leading to faster development cycles and increased scalability. Edge computing, which processes data closer to the source (on devices or local servers), will become crucial for applications requiring real-time responses, such as IoT and autonomous vehicles.

The projected growth indicates a strong and continuous demand for blockchain and cloud professionals in Europe and worldwide. As we stand at the “crossroads of infinity,” there is a significant skills shortage, one that will likely widen with the rapid adoption of these technologies. A 2023 study by SoftwareOne found that 95% of businesses globally face a cloud skills gap. Specific skills in high demand include cloud security, cloud-native development, and expertise in leading cloud platforms such as AWS, Azure, and Google Cloud. The European Commission’s Digital Economy and Society Index (DESI) highlights a need for improved digital skills in areas like blockchain to support the EU’s digital transformation goals. A 2023 report by CasperLabs found that 90% of businesses in the US, UK, and China are adopting blockchain, but knowledge gaps and interoperability challenges persist.

The Role of Educational Institutions

This surge in demand necessitates a corresponding increase in qualified individuals who can design, implement, and manage cloud-based and blockchain solutions. Educational institutions have a critical role to play in bridging this widening skills gap and ensuring a pipeline of talent ready to meet the demands of this burgeoning industry.

To effectively prepare the next generation of cloud computing and blockchain experts, educational institutions need to adopt a multi-pronged approach. This includes enhancing curricula with specialized programs, integrating cloud and blockchain concepts into existing courses, and providing hands-on experience with leading technology platforms.

Furthermore, investing in faculty development to ensure they possess up-to-date knowledge and expertise is crucial. Collaboration with industry partners through internships, co-teaching programs, joint research projects, and mentorship programs can provide students with invaluable real-world experience and insights.

Beyond formal education, fostering a culture of lifelong learning is essential. Offering continuing education courses, boot camps, and online resources enables professionals to upskill or reskill and stay abreast of the latest advancements in cloud computing and blockchain. Actively promoting awareness of career paths and opportunities in this field and facilitating connections with potential employers can empower students to thrive in the dynamic and evolving landscape of cloud computing and blockchain technologies.

By taking these steps, educational institutions can effectively prepare the young generation to fill the skills gap and thrive in the rapidly evolving world of cloud computing and blockchain.


Related posts

Wired: Think Twice Before Creating That ChatGPT Action Figure
OPIT - Open Institute of Technology
May 12, 2025 · 6 min read

Source:

  • Wired, published on May 1, 2025

People are using ChatGPT’s new image generator to take part in viral social media trends. But using it also puts your privacy at risk—unless you take a few simple steps to protect yourself.

By Kate O’Flaherty

At the start of April, an influx of action figures started appearing on social media sites including LinkedIn and X. Each figure depicted the person who had created it with uncanny accuracy, complete with personalized accessories such as reusable coffee cups, yoga mats, and headphones.

All this is possible because of OpenAI’s new GPT-4o-powered image generator, which supercharges ChatGPT’s ability to edit pictures, render text, and more. OpenAI’s ChatGPT image generator can also create pictures in the style of Japanese animated film company Studio Ghibli—a trend that quickly went viral, too.

The images are fun and easy to make—all you need is a free ChatGPT account and a photo. Yet to create an action figure or Studio Ghibli-style image, you also need to hand over a lot of data to OpenAI, which could be used to train its models.

Hidden Data

The data you are giving away when you use an AI image editor is often hidden. Every time you upload an image to ChatGPT, you’re potentially handing over “an entire bundle of metadata,” says Tom Vazdar, area chair for cybersecurity at Open Institute of Technology. “That includes the EXIF data attached to the image file, such as the time the photo was taken and the GPS coordinates of where it was shot.”
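If you want to see (and remove) that EXIF bundle before uploading a photo, a minimal sketch along the following lines works with the Pillow imaging library. The library choice and file names are illustrative assumptions, not a recommendation from the article; any EXIF-aware tool does the same job. The sketch prints the embedded tags, including any GPS sub-directory, and writes a copy that carries only the pixel data.

    # Illustrative sketch: inspect a photo's EXIF metadata, then save a copy
    # that contains only the pixel data. File names are placeholders.
    from PIL import Image, ExifTags

    def inspect_and_strip(path: str, clean_path: str) -> None:
        img = Image.open(path)
        exif = img.getexif()

        # Print the main EXIF tags (capture time, camera make/model, software, ...).
        for tag_id, value in exif.items():
            print(f"{ExifTags.TAGS.get(tag_id, tag_id)}: {value}")

        # GPS coordinates live in a separate EXIF sub-directory (IFD).
        for tag_id, value in exif.get_ifd(ExifTags.IFD.GPSInfo).items():
            print(f"GPS {ExifTags.GPSTAGS.get(tag_id, tag_id)}: {value}")

        # Re-save only the pixels; the new file has no EXIF block attached.
        clean = Image.new(img.mode, img.size)
        clean.putdata(list(img.getdata()))
        clean.save(clean_path)

    inspect_and_strip("portrait.jpg", "portrait_clean.jpg")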

OpenAI also collects data about the device you’re using to access the platform. That means your device type, operating system, browser version, and unique identifiers, says Vazdar. “And because platforms like ChatGPT operate conversationally, there’s also behavioral data, such as what you typed, what kind of images you asked for, how you interacted with the interface and the frequency of those actions.”

It’s not just your face. If you upload a high-resolution photo, you’re giving OpenAI whatever else is in the image, too—the background, other people, things in your room and anything readable such as documents or badges, says Camden Woollven, group head of AI product marketing at risk management firm GRC International Group.

This type of voluntarily provided, consent-backed data is “a gold mine for training generative models,” especially multimodal ones that rely on visual inputs, says Vazdar.

OpenAI denies it is orchestrating viral photo trends as a ploy to collect user data, yet the firm certainly gains an advantage from it. OpenAI doesn’t need to scrape the web for your face if you’re happily uploading it yourself, Vazdar points out. “This trend, whether by design or a convenient opportunity, is providing the company with massive volumes of fresh, high-quality facial data from diverse age groups, ethnicities, and geographies.”

OpenAI says it does not actively seek out personal information to train models—and it doesn’t use public data on the internet to build profiles about people to advertise to them or sell their data, an OpenAI spokesperson tells WIRED. However, under OpenAI’s current privacy policy, images submitted through ChatGPT can be retained and used to improve its models.

Any data, prompts, or requests you share helps teach the algorithm—and personalized information helps fine-tune it further, says Jake Moore, global cybersecurity adviser at security outfit ESET, who created his own action figure to demonstrate the privacy risks of the trend on LinkedIn.

Uncanny Likeness

In some markets, your photos are protected by regulation. In the UK and EU, data-protection regulations, including the GDPR, offer strong protections, including the right to access or delete your data. At the same time, use of biometric data requires explicit consent.

However, photographs become biometric data only when processed through a specific technical means allowing the unique identification of a specific individual, says Melissa Hall, senior associate at law firm MFMac. Processing an image to create a cartoon version of the subject in the original photograph is “unlikely to meet this definition,” she says.

Meanwhile, in the US, privacy protections vary. “California and Illinois are leading with stronger data protection laws, but there is no standard position across all US states,” says Annalisa Checchi, a partner at IP law firm Ionic Legal. And OpenAI’s privacy policy doesn’t contain an explicit carve-out for likeness or biometric data, which “creates a grey area for stylized facial uploads,” Checchi says.

The risks include your image or likeness being retained, potentially used to train future models, or combined with other data for profiling, says Checchi. “While these platforms often prioritize safety, the long-term use of your likeness is still poorly understood—and hard to retract once uploaded.”

OpenAI says its users’ privacy and security are a top priority. The firm wants its AI models to learn about the world, not private individuals, and it actively minimizes the collection of personal information, an OpenAI spokesperson tells WIRED.

Meanwhile, users have control over how their data is used, with self-service tools to access, export, or delete personal information. You can also opt out of having content used to improve models, according to OpenAI.

ChatGPT Free, Plus, and Pro users can control whether they contribute to future model improvements in their data controls settings. OpenAI does not train on ChatGPT Team, Enterprise, and Edu customer data by default, according to the company.

LADBible and Yahoo News: Viral AI trend could present huge privacy concerns, says expert
OPIT - Open Institute of Technology
May 12, 2025 · 4 min read

You’ve probably seen them all over Instagram

By James Moorhouse

Experts have warned against participating in a viral social media trend which sees people use ChatGPT to create an action figure version of themselves.

If you’ve spent any time whatsoever doomscrolling on Instagram, TikTok or (dare I say it) LinkedIn recently, you’ll be all too aware of the viral trend.

Obviously, there’s nothing more entertaining and frivolous than seeing AI generated versions of your co-workers and their cute little laptops and piña coladas, but it turns out that it might not be the best idea to take part.

There may well be some benefits to artificial intelligence but often it can produce some pretty disturbing results. Earlier this year, a lad from Norway sued ChatGPT after it falsely claimed he had been convicted of killing two of his kids.

Unfortunately, if you don’t like AI, then you’re going to have to accept that it’s going to become a regular part of our lives. You only need to look at WhatsApp or Facebook Messenger to realise that. But it’s always worth saying please and thank you to ChatGPT just in case society does collapse and the AI robots take over, in the hope that they treat you mercifully, although it might cost them a little more electricity.

Anyway, in case you’re thinking of getting involved in this latest AI trend and sharing your face and your favourite hobbies with a high tech robot, maybe don’t. You don’t want to end up starring in your own Netflix series, à la Black Mirror.

Tom Vazdar, area chair for cybersecurity at Open Institute of Technology, spoke with Wired about some of the dangers of sharing personal details about yourself with AI.

Every time you upload an image to ChatGPT, you’re potentially handing over ‘an entire bundle of metadata’, he revealed.

Vazdar added: “That includes the EXIF data attached to the image file, such as the time the photo was taken and the GPS coordinates of where it was shot.

“Because platforms like ChatGPT operate conversationally, there’s also behavioural data, such as what you typed, what kind of images you asked for, how you interacted with the interface and the frequency of those actions.”

Essentially, if you upload a photo of your face, you’re not just giving AI access to your face, but also to whatever is in the background, such as the location or other people who might feature.

Vazdar concluded: “This trend, whether by design or a convenient opportunity, is providing the company with massive volumes of fresh, high-quality facial data from diverse age groups, ethnicities, and geographies.”

While we’re at it, maybe stop using ChatGPT for your university essays and general basic questions you can find the answer to on Google as well. The last thing you need is AI knowing you don’t know how to do something basic if it does take over the world.
