As computing technology evolved and the concept of linking multiple computers together into a “network” that could share data took shape, it became clear that a model was needed to define and enable those connections. Enter the OSI model.


This model allows various devices and software to “communicate” with one another by creating a set of universal rules and functions. Let’s dig into what the model entails.


History of the OSI Model


In the late 1970s, the continued development of computerized technology saw many companies start to introduce their own systems. These systems stood alone from others. For example, a computer at Retailer A had no way to communicate with a computer at Retailer B, and neither could communicate with the various vendors and other organizations within the retail supply chain.


Clearly, some way of connecting these standalone systems was needed. Researchers from France, the U.S., and the U.K. took up the problem within two standards bodies – the International Organization for Standardization (ISO) and the International Telegraph and Telephone Consultative Committee (CCITT).


In 1983, these two groups merged their work to create “The Basic Reference Model for Open Systems Interconnection (OSI).” This model established industry standards for communication between networked devices, though the path to OSI’s implementation wasn’t as smooth as it could have been. The 1980s and 1990s saw the introduction of another model – the TCP/IP model – which competed against the OSI model for supremacy. TCP/IP gained so much traction that it became the cornerstone of the then-budding internet, leading to the OSI model falling out of favor in many sectors. Despite this, the OSI model is still a valuable reference point for students who want to learn more about networking, and it still has some practical uses in industry.


The OSI Reference Model


The OSI model works by splitting the concept of computers communicating with one another into seven layers (defined below), each offering standardized rules for its specific function. These layers work in concert, allowing systems to communicate as long as they follow the rules.


Though the OSI model in computer network applications has fallen out of favor on a practical level, it still offers several benefits:


  • The OSI model is perfect for teaching network architecture because it defines how computers communicate.
  • OSI is a layered model, with separation between each layer, so one layer doesn’t affect the operation of any other.
  • The OSI model offers flexibility because of the distinctions it makes between layers, with users being able to replace protocols in any layer without worrying about how they’ll impact the other layers.

The 7 Layers of the OSI Model


The OSI reference model is a lot like an onion. It has several layers, each standing alone but each needing to be peeled back to get a result. But where peeling back the layers of an onion gets you a tasty ingredient or treat, peeling them back in the OSI model delivers a better understanding of networking and the protocols that lie behind it.


Each of these seven layers serves a different function.
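For quick reference, here is a minimal sketch in plain Python that lists the seven layers in order. The enum name and the printing loop are illustrative choices for this article, not part of any standard.

```python
from enum import IntEnum

class OSILayer(IntEnum):
    """The seven OSI layers, numbered from the lowest (physical) upward."""
    PHYSICAL = 1
    DATA_LINK = 2
    NETWORK = 3
    TRANSPORT = 4
    SESSION = 5
    PRESENTATION = 6
    APPLICATION = 7

# Print the stack from top to bottom, the way it is usually drawn.
for layer in sorted(OSILayer, reverse=True):
    print(f"Layer {layer.value}: {layer.name.replace('_', ' ').title()}")
```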


Layer 1: Physical Layer


Sitting at the lowest level of the OSI model, the physical layer is all about the hows and wherefores of transmitting electrical signals from one device to another. Think of it as the protocols needed for the pins, cables, voltages, and every other component of a physical device if said device wants to communicate with another that uses the OSI model.
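To make the idea concrete, here is a minimal sketch of the kind of concern that lives at this layer: turning raw bits into signal levels on a wire. The NRZ-style mapping and the voltage values are illustrative assumptions, not taken from any real standard.

```python
# Illustrative only: map each bit to a voltage sample, roughly like an NRZ line code.
# The 5.0 V / 0.0 V levels are arbitrary choices for this sketch.
HIGH_VOLTS = 5.0
LOW_VOLTS = 0.0

def bits_to_voltages(data: bytes) -> list:
    """Turn every bit of the payload into a voltage level, most significant bit first."""
    voltages = []
    for byte in data:
        for position in range(7, -1, -1):
            bit = (byte >> position) & 1
            voltages.append(HIGH_VOLTS if bit else LOW_VOLTS)
    return voltages

print(bits_to_voltages(b"\xa5"))  # 10100101 -> [5.0, 0.0, 5.0, 0.0, 0.0, 5.0, 0.0, 5.0]
```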


Layer 2: Data Link Layer


With the physical layer in place, the challenge shifts to transmitting data between devices. The data link layer defines how node-to-node transfer occurs, packaging data into “frames” and detecting (and, where possible, correcting) errors that may happen in the physical layer.


The data link layer has two “sub-layers” of its own (a simplified framing sketch follows the list):


  • MAC – The Media Access Control sub-layer, which governs how and when a device transmits over the shared physical medium, handling multiplexing and flow control for an OSI network.
  • LLC – The Logical Link Control sub-layer, which provides error control for data travelling across the physical media (i.e., the devices) used to carry it over a connection.
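Here is a minimal framing sketch in Python. The frame layout (two 6-byte addresses, a payload, and a CRC-32 trailer) is a simplification for teaching purposes, not the real Ethernet format.

```python
import zlib

def build_frame(dst_mac: bytes, src_mac: bytes, payload: bytes) -> bytes:
    """Wrap a payload in a simplified frame: addresses + payload + CRC-32 trailer."""
    body = dst_mac + src_mac + payload
    checksum = zlib.crc32(body).to_bytes(4, "big")
    return body + checksum

def check_frame(frame: bytes) -> bool:
    """Recompute the checksum so errors introduced on the wire can be detected."""
    body, trailer = frame[:-4], frame[-4:]
    return zlib.crc32(body).to_bytes(4, "big") == trailer

frame = build_frame(b"\xaa\xbb\xcc\xdd\xee\xff", b"\x11\x22\x33\x44\x55\x66", b"hello")
print(check_frame(frame))             # True: the frame arrived intact

corrupted = bytearray(frame)
corrupted[12] ^= 0x01                 # flip one payload bit, as line noise might
print(check_frame(bytes(corrupted)))  # False: the error is detected
```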

Layer 3: Network Layer


The network layer is like an intermediary between devices: it addresses data as “packets” and routes them, often across several networks, toward their intended destination. Think of this layer as the postal service of the OSI model.
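A core job at this layer is deciding where to forward a packet next. Below is a minimal longest-prefix-match sketch using Python’s standard ipaddress module; the routing table entries and next-hop addresses are made up for the example.

```python
import ipaddress

# A hypothetical routing table: destination prefix -> next hop.
ROUTES = {
    ipaddress.ip_network("10.0.0.0/8"): "10.0.0.1",
    ipaddress.ip_network("10.1.0.0/16"): "10.1.0.1",
    ipaddress.ip_network("0.0.0.0/0"): "192.168.1.254",   # default route
}

def next_hop(destination: str) -> str:
    """Pick the most specific (longest-prefix) route that contains the address."""
    address = ipaddress.ip_address(destination)
    matches = [network for network in ROUTES if address in network]
    best = max(matches, key=lambda network: network.prefixlen)
    return ROUTES[best]

print(next_hop("10.1.2.3"))   # 10.1.0.1 (the /16 beats the /8)
print(next_hop("8.8.8.8"))    # 192.168.1.254 (falls through to the default route)
```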



Layer 4: Transport Layer


If the network layer is a delivery person, the transport layer is the van that the delivery person uses to carry their parcels (i.e., data packets) between addresses. This layer regulates the sequencing, sizing, and transferring of data between hosts and systems. TCP (Transmission Control Protocol) is a good example of a transport layer protocol in practical applications.
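Here is a minimal sketch of TCP in action using Python’s standard socket module. The loopback address and port number are placeholder choices for the example.

```python
import socket
import threading

HOST, PORT = "127.0.0.1", 50007   # placeholder address and port for this sketch

# Bind and listen before starting the client so the connection cannot race ahead.
server = socket.create_server((HOST, PORT))

def echo_once() -> None:
    """Accept one TCP connection and echo back whatever it receives."""
    conn, _ = server.accept()
    with conn:
        conn.sendall(conn.recv(1024))

threading.Thread(target=echo_once, daemon=True).start()

# TCP (layer 4) hands the application an ordered, reliable byte stream.
with socket.create_connection((HOST, PORT)) as client:
    client.sendall(b"ping")
    print(client.recv(1024))   # b'ping'

server.close()
```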


Layer 5: Session Layer


When one device wants to communicate with another, it sets up a “session” in which the communication takes place, similar to how your boss may schedule a meeting with you when they want to talk. The session layer regulates how the connections between machines are set up and managed, in addition to providing authorization controls to ensure no unwanted devices can interrupt or “listen in” on the session.
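As a rough illustration of the session idea, here is a toy Python class in which a “session” only opens after an authorization check and every message is tied to that session. The class name, secret, and token scheme are invented for the example and don’t correspond to any real protocol.

```python
import secrets

class ToySession:
    """A toy session: communication is only allowed once authorization succeeds."""

    def __init__(self, shared_secret: str) -> None:
        self.shared_secret = shared_secret
        self.token = None   # no session established yet

    def open(self, presented_secret: str) -> None:
        """Establish the session, refusing callers that fail the authorization check."""
        if presented_secret != self.shared_secret:
            raise PermissionError("authorization failed; session not established")
        self.token = secrets.token_hex(8)

    def send(self, message: str) -> str:
        if self.token is None:
            raise RuntimeError("open the session before sending data")
        return f"[session {self.token}] {message}"

session = ToySession(shared_secret="hunter2")   # illustrative secret
session.open("hunter2")
print(session.send("hello over an authorized session"))
```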


Layer 6: Presentation Layer


Presentation matters when sending data from one system to another. The presentation layer “pretties up” data by formatting and translating it into a syntax that the recipient’s application accepts. Encryption and decryption are a perfect example: a data packet can be encrypted so it’s unreadable to anybody who intercepts it, then decrypted via the presentation layer so the intended recipient can see what the data packet contains.
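The sketch below shows the translation idea with Python’s standard library: structured data is serialized to JSON and Base64-encoded for transit, then reversed on the other side. Note that Base64 is only an encoding; a real presentation layer doing encryption would use a proper cipher instead.

```python
import base64
import json

def present(data: dict) -> bytes:
    """Translate application data into an agreed wire syntax: JSON text, UTF-8, Base64."""
    text = json.dumps(data)                      # structured data -> common syntax
    return base64.b64encode(text.encode("utf-8"))

def interpret(wire_bytes: bytes) -> dict:
    """Reverse the translation so the receiving application gets usable data back."""
    text = base64.b64decode(wire_bytes).decode("utf-8")
    return json.loads(text)

encoded = present({"order_id": 42, "item": "running shoes"})
print(encoded)               # opaque-looking bytes while in transit
print(interpret(encoded))    # {'order_id': 42, 'item': 'running shoes'}
```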


Layer 7: Application Layer


The application layer is the front end through which the end user interacts with everything that’s going on behind the scenes in the network. It’s usually a piece of software that puts a user-friendly face on a network. For instance, the Google Chrome web browser works at the application layer, using protocols such as HTTP to reach the entire network of connections that make up the internet.
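A tiny example of application-layer work is an HTTP request, the same kind of exchange a browser performs. The sketch uses Python’s standard urllib and a placeholder URL; it needs network access to run.

```python
from urllib.request import urlopen

# Placeholder URL; substitute any site you can reach.
with urlopen("https://example.com") as response:
    print(response.status)     # e.g. 200 if the request succeeded
    print(response.read(80))   # the first few bytes of the HTML a browser would render
```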


Interactions Between OSI Layers


Though each of the OSI layers is independent (which provides the flexibility mentioned earlier), they must also interact with one another to make the network functional.


We see this most obviously in the data encapsulation and de-encapsulation that occurs in the model. Encapsulation is the process of adding information to a data packet as it travels, with de-encapsulation being the method used to remove that added data so the end user can read what was originally sent. The previously mentioned encryption and decryption of data is a good example.


That process of encapsulation and de-encapsulation defines how the OSI model works. Each layer adds its own little “flavor” to the transmitted data packet, with each subsequent layer either adding something new or stripping away something previously added so it can read the data. Each of these additions and removals is governed by the protocols set within each layer. A network can only function properly if these protocols govern data transmission correctly, allowing for communication between each layer.
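Here is a minimal sketch of the wrap-and-unwrap idea. Real protocols use binary headers, but JSON wrappers make the nesting easy to see; the layer names and payload are illustrative.

```python
import json

def encapsulate(payload: str) -> str:
    """Each layer wraps the data it receives in its own header on the way down."""
    message = payload
    for layer in ["transport", "network", "data_link"]:
        message = json.dumps({"layer": layer, "data": message})
    return message

def de_encapsulate(message: str) -> str:
    """The receiver strips the headers in the reverse order on the way back up."""
    for _ in ["data_link", "network", "transport"]:
        message = json.loads(message)["data"]
    return message

wire_message = encapsulate("GET /index.html")
print(wire_message)                  # the payload nested inside three layers of headers
print(de_encapsulate(wire_message))  # GET /index.html
```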


Real-World Applications of the OSI Model


There’s a reason why the OSI model is often called a “reference” model – though important, it was largely supplanted by other models in practice. As a result, you’ll rarely see the OSI model used as a way to connect devices, with TCP/IP being far more popular. Still, there are several practical applications for the OSI model.


Network Troubleshooting and Diagnostics


Given that some modern computer networks are unfathomably complex, picking out a single error that messes up the whole communication process can feel like navigating a minefield. Every wrong step causes something else to blow up, leading to more problems than you solve. The OSI model’s layered approach offers a way to break down the different aspects of a network to make it easier to identify problems.
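In practice, that means checking the layers one at a time. The sketch below works upward from name resolution to a transport-layer connection; the hostname and port are placeholders, and it needs network access to run.

```python
import socket

HOST = "example.com"   # placeholder target for this sketch

try:
    address = socket.gethostbyname(HOST)
    print(f"DNS resolves {HOST} to {address}")
except socket.gaierror:
    print("Name resolution failed: look below the application layer (DNS, network path)")
else:
    try:
        # A successful TCP connect suggests layers 1-4 are healthy for this target.
        with socket.create_connection((address, 443), timeout=3):
            print("TCP connection on port 443 succeeded")
    except OSError:
        print("TCP connect failed: suspect the network or transport layer, not the application")
```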


Network Design and Implementation


Though the OSI model has few practical purposes today, as a theoretical model it’s often seen as the basis for all networking concepts that came after. That makes it an ideal teaching tool for showcasing how networks are designed and implemented. Some engineers even refer to the model when creating networks based on other models, with the layered approach helping them reason about complex networks.


Enhancing Network Security


The concept of encapsulation and de-encapsulation comes to the fore again here (remember – encryption), as it shows why it’s dangerous to let a data packet move through a network unprotected. The OSI model shows how transforming that packet as it goes on its journey makes it easier to protect data from unwanted eyes.



Limitations and Criticisms of the OSI Model


Despite its many uses as a teaching tool, the OSI model has limitations that explain why it sees few practical applications:


  • Complexity – As valuable as the layered approach may be to teaching networks, it’s often too complex to execute in practice.
  • Overlap – The very flexibility that makes OSI great for people who want more control over their networks can come back to bite the model. In practice, similar functions (such as error and flow control) end up duplicated across layers, and the layers aren’t as independent as the model suggests – each of the computer network layers needs the others to work.
  • The Existence of Alternatives – The OSI model walked so other models could run, establishing many fundamental networking concepts that other models executed better in practical terms. Again, the massive network known as the internet is a great example, as it uses the TCP/IP model to reduce complexity and more effectively transmit data.

Use the OSI Reference Model in Computer Network Applications


Though it has little practical application in today’s world, the OSI model is a theoretical framework that played a crucial role in establishing many of the “rules” of networking still used today. Its continued importance is reflected in the fact that many computing courses use the OSI model to teach the fundamentals of networks.


Think of learning about the OSI model as being similar to laying the foundations for a house. You’ll get to grips with the basic concepts of how networks work, allowing you to build up your knowledge by incorporating both current networking technology and future advancements to become a networking specialist.

Related posts

Juggling Work and Study: Interview With OPIT Student Karina
OPIT - Open Institute of Technology
Jun 5, 2025

During the Open Institute of Technology’s (OPIT’s) 2025 Graduation Day, we conducted interviews with many recent graduates to understand why they chose OPIT, how they felt about the course, and what advice they might give to others considering studying at OPIT.

Karina is an experienced FinTech professional – an integration manager, ERP specialist, and business analyst. She was interested in learning AI applications to expand her career possibilities, and she chose OPIT’s MSc in Applied Data Science & AI.

In the interview, Karina discussed why she chose OPIT over other courses of study, the main challenges she faced when completing the course while working full-time, and the kind of support she received from OPIT and other students.

Why Study at OPIT?

Karina explained that she was interested in enhancing her AI skills to take advantage of a major emerging technology in the FinTech field. She said that she was looking for a course that was affordable and that she could manage alongside her current demanding job. Karina noted that she did not have the luxury to take time off to become a full-time student.

She was principally looking at courses in the United States and the United Kingdom. She found that comprehensive courses were expensive, costing upwards of $50,000, and did not always offer flexible study options. Meanwhile, flexible courses that she could complete while working offered excellent individual modules, but didn’t always add up to a coherent whole. This was something that set OPIT apart.

Karina admits that she was initially skeptical when she encountered OPIT because, at the time, it was still very new. OPIT only started offering courses in September 2023, so 2025 marked its first cohort of graduates.

Nevertheless, Karina was interested in OPIT’s affordable study options and the flexibility of fully remote learning and part-time options. She said that when she looked into the course, she realized that it aligned very closely with what she was looking for.

In particular, Karina noted that she was always wary of further study because of the level of mathematics required in most computer science courses. She appreciated that OPIT’s course focused on understanding the underlying core principles and the potential applications, rather than the fine programming and mathematical details. This made the course more applicable to her professional life.

OPIT’s MSc in Applied Data Science & AI

The course Karina took was OPIT’s MSc in Applied Data Science & AI. It is a three- to four-term course (each term runs 13 weeks), which can take between one and two years to complete, depending on the pace you choose and whether you choose the 90 or 120 ECTS option. As well as part-time, there are also regular and fast-track options.

The course is fully online and completed in English, with an accessible tuition fee of €2,250 per term – €6,750 for the 90 ECTS course and €9,000 for the 120 ECTS course. Payment plans and scholarships are available, and there are discounts if you pay the full amount upfront.

It pairs foundational tech modules with business application modules to build a strong base, then ends with a term-long research project culminating in a thesis. Internships with industry partners are encouraged and facilitated by OPIT, or professionals can work on projects within their own companies.

Entry requirements include a bachelor’s degree or equivalency in any field, including non-tech fields, and English proficiency to a B2 level.

Faculty members include Pierluigi Casale, a former Data Science and AI Innovation Officer for the European Parliament and Principal Data Scientist at TomTom; Paco Awissi, former VP at PSL Group and an instructor at McGill University; and Marzi Bakhshandeh, a Senior Product Manager at ING.

Challenges and Support

Karina shared that her biggest challenge while studying at OPIT was time management and juggling the heavy learning schedule with her hectic job. She admitted that when balancing the two, there were times when her social life suffered, but it was doable. The key to her success was organization, time management, and the support of the rest of the cohort.

According to Karina, the cohort WhatsApp group was often a lifeline that helped keep her focused and optimistic during challenging times. Sharing challenges with others in the same boat and seeing the example of her peers often helped.

The OPIT Cohort

OPIT has a wide and varied cohort with over 300 students studying remotely from 78 countries around the world. Around 80% of OPIT’s students are already working professionals who are currently employed at top companies in a variety of industries. This includes global tech firms such as Accenture, Cisco, and Broadcom, FinTech companies like UBS, PwC, Deloitte, and the First Bank of Nigeria, and innovative startups and enterprises like Dynatrace, Leonardo, and the Pharo Foundation.

Study Methods

This cohort meets in OPIT’s online classrooms, powered by the Canvas Learning Management System (LMS). One of the world’s leading teaching and learning platforms, Canvas acts as a virtual hub for all of OPIT’s academic activities, including live lectures and discussion boards. OPIT also uses the same portal to conduct continuous assessments and prepare students before final exams.

If you want to collaborate with other students, there is a collaboration tab where you can set up workrooms, and also an official Slack platform. Students tend to use WhatsApp for other informal communications.

If students need additional support, they can book an appointment with the course coordinator through Canvas to get advice on managing their workload and balancing their commitments. Students also get access to experienced career advisor Mike McCulloch, who can provide expert guidance.

A Supportive Environment

These services and resources create a supportive environment for OPIT students, which Karina says helped her throughout her course of study. Karina suggests organization and leaning into help from the community are the best ways to succeed when studying with OPIT.

Leading in the Digital Age: Navigating Strategy in the Metaverse
OPIT - Open Institute of Technology
Jun 5, 2025

In April 2025, Professor Francesco Derchi, Chair of OPIT’s Digital Business programs at the Open Institute of Technology (OPIT), entered the online classroom to talk about the current state of the Metaverse and what companies can do to engage with this technological shift. As an expert in digital marketing, he is well placed to talk about how brands can leverage the Metaverse to further company goals.

Current State of the Metaverse

Francesco started by exploring what the Metaverse is and the rocky history of its development. Although many associate the term Metaverse with Mark Zuckerberg’s 2021 announcement of Meta’s pivot toward a virtual immersive experience co-created by users, the concept actually existed long before. In his 1992 novel Snow Crash, author Neal Stephenson described a very similar concept, with people using avatars to seamlessly step out of the real world and into a highly connected virtual world.

Zuckerberg’s announcement was not even the start of real Metaverse-like experiences. Released in 2003, Second Life is a virtual world in which multiple users come together and engage through avatars. Participation in Second Life peaked at about one million active users in 2007. Similarly, Minecraft, released in 2011, is a virtual world where users can explore and build, and it offers multiplayer options.

What set Zuckerberg’s vision apart from these earlier iterations is that he imagined a much broader virtual world, with almost limitless creation and interaction possibilities. However, this proved much more difficult in practice.

Both Meta and Microsoft started investing significantly in the Metaverse at around the same time, with Microsoft completing its acquisition of Activision Blizzard – a gaming company that creates virtual world games such as World of Warcraft – in 2023 and working with Epic Games to bring Fortnite to its Xbox cloud gaming platform.

But limited adoption of new Metaverse technology saw both Meta and Microsoft announce major layoffs and cutbacks on their Metaverse investments.

Open Garden Metaverse

One of the major issues for the big Metaverse vision is that it requires an open Metaverse rather than a walled garden. Matthew Ball defined this kind of Metaverse in his 2022 book:

“A massively scaled and interoperable network of real-time rendered 3D virtual worlds that can be experienced synchronously and persistently by an effectively unlimited number of users with an individual sense of presence, and with continuity of data, such as identity, history, entitlements, objects, communication, and payments.”

This vision requires an open Metaverse, a virtual world beyond any single company’s walled garden that allows interaction across platforms. With the current technology and state of the market, this is believed to be at least 10 years away.

With that in mind, Zuckerberg and Meta have pivoted away from expanding their Metaverse towards delivering devices such as AI glasses with augmented reality capabilities and virtual reality headsets.

Nevertheless, the Metaverse is still expanding today, but within walled garden contexts. Francesco pointed to Pokémon Go and Roblox as examples of Metaverse-esque worlds with enormous engagement and popularity.

Brands Engaging with the Metaverse: Nike Case Study

What does that mean for brands? Should they ignore the Metaverse until it becomes a more realistic proposition, or should they be establishing their Metaverse presence now?

Francesco used Nike’s successful approach to Metaverse engagement to show how brands can leverage the Metaverse today.

He pointed out that this was a strategic move from Nike to protect their brand. As a cultural phenomenon, people will naturally bring their affinity with Nike into the virtual space with them. If Nike doesn’t constantly monitor that presence, they can lose control of it. Rather than see this as a threat, Nike identified it as an opportunity. As people engage more online, their virtual appearance can become even more important than their physical appearance. Therefore, there is a space for Nike to occupy in this virtual world as a cultural icon.

Nike chose an ad hoc approach, going to users where they are and providing experiences within popular existing platforms.

As more than 1.5 million people play Fortnite every day, Nike started there, first selling a variety of virtual shoes that users can buy to kit out their avatars.

Roblox similarly has around 380 million monthly active users, so Nike entered the space and created NIKELAND, a purpose-built virtual area that offers a unique brand experience in the virtual world. For example, during NBA All-Star Week, LeBron James visited NIKELAND, where he coached and engaged with players. During the FIFA World Cup, NIKELAND let users claim two free soccer jerseys to show support for their favorite teams. According to statistics published at the end of 2023, in less than two years, NIKELAND had more than 34.9 million visitors, with over 13.4 billion hours of engagement and $185 million in NFT (non-fungible tokens or unique digital assets) sales.

Final Thoughts

Francesco concluded by noting that while Nike has been successful in the Metaverse, this is not necessarily a success that smaller brands will find simple to replicate. Nike was successful in the virtual world because they are a cultural phenomenon, and the Metaverse is a combination of technology and culture.

Therefore, brands today must decide how to engage with the current state of the Metaverse and prepare for its potential future expansion. Because existing Metaverses are walled gardens, brands also need to decide which Metaverses warrant investment or whether it is worth creating their own dedicated platforms. This all comes down to an appetite for risk.

Facing these types of challenges comes down to understanding the business potential of new technologies and making decisions based on risk and opportunity. OPIT’s BSc in Digital Business and MSc in Digital Business and Innovation help develop these skills, with Francesco also serving as program chair.
