6 Best Practices When Developing XR for Industrial Applications https://www.engineering.com/resources/6-best-practices-when-developing-xr-for-industrial-applications/ Mon, 21 Oct 2024 13:44:35 +0000 https://www.engineering.com/?post_type=resources&p=133025 Through Industry 4.0 and the industrial internet of things (IIoT), developers have brought industry into the digital realm. Industry experts can learn, control and share anything about a process with a few clicks. But these experts are still limited by their physical connections.

Developers, however, can start to blend the physical and digital realms via technologies like virtual reality (VR), augmented reality (AR) and mixed reality (MR) — collectively referred to as extended reality (XR). But this dream is still in its infancy. As a result, developers need guidelines to ensure they are going down the correct path when creating XR experiences.

In this 7-page ebook, developers will learn:

  • How XR is bound to change industry.
  • Which challenges exist when making XR experiences for industry.
  • Six best practices to keep the development of industrial XR experiences on track.
  • How Unity can help make industrial XR experiences a reality.

To download your free ebook, fill out the form on this page. Your download is sponsored by Unity Technologies.

What are the roles of sensors and actuators in IIoT? https://www.engineering.com/what-are-the-roles-of-sensors-and-actuators-in-iiot/ Mon, 07 Oct 2024 19:48:25 +0000 https://www.engineering.com/?p=132533 Sensors are the eyes and ears of your operation and actuators are the hands.

Every manufacturing engineer considering an IIoT implementation should put considerable focus into how the systems contribute to data collection, real-time decision-making and automated control within the production environment.

Sensors are the eyes and ears of your operation. These data collection devices continuously monitor various physical or environmental parameters on the shop floor. Sensors have been developed to measure almost any condition on the shop floor. Here are some common types:

Temperature (for controlling furnaces or ovens)

Pressure (for monitoring hydraulic or pneumatic systems)

Vibration (for detecting imbalance in motors or machinery)

Humidity (for ensuring optimal conditions in certain manufacturing processes)

Proximity (for part detection on a conveyor belt or pallet)

Torque and Force (for ensuring precise assembly or machining)

These days, most sensors provide real-time data that are essential for understanding the status of machines, the health of equipment and the quality of products.

Sensors can capture data continuously or at regular intervals, feeding it back to a centralized system or edge devices. This data allows you to monitor machine performance and production quality in real-time. By continuously monitoring conditions such as temperature, vibration and pressure, sensors can help predict equipment failures before they happen—enabling predictive maintenance strategies. This minimizes downtime and unplanned repairs. Sensors can also ensure product quality by tracking parameters such as size, weight or chemical composition, ensuring products are within acceptable tolerances.

The data collected by sensors is sent to centralized cloud systems or edge devices for real-time analysis, enabling manufacturers to make informed decisions on production adjustments and process improvements.
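
As a rough illustration of the monitoring pattern described above, the short sketch below polls a (simulated) vibration sensor and flags readings that cross a maintenance threshold. The sensor driver, threshold value and alert hook are hypothetical stand-ins, not references to any particular product.

```python
import random
import time

VIBRATION_LIMIT_MM_S = 7.1  # hypothetical alarm threshold for RMS vibration velocity

def read_vibration_mm_s() -> float:
    """Stand-in for a real sensor driver; returns RMS vibration velocity in mm/s."""
    return random.uniform(1.0, 9.0)

def raise_maintenance_alert(value: float) -> None:
    """Stand-in for pushing an alert to a maintenance system, dashboard or message queue."""
    print(f"ALERT: vibration {value:.1f} mm/s exceeds the {VIBRATION_LIMIT_MM_S} mm/s limit")

def monitor(poll_seconds: float = 1.0, samples: int = 10) -> None:
    """Poll the sensor at a fixed interval and flag readings that exceed the limit."""
    for _ in range(samples):
        value = read_vibration_mm_s()
        if value > VIBRATION_LIMIT_MM_S:
            raise_maintenance_alert(value)
        time.sleep(poll_seconds)

if __name__ == "__main__":
    monitor()
```

In a production system, the same loop would typically hand readings to an edge gateway or broker for analysis rather than printing them.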

Actuators: The Hands of Your IIoT System

Once sensors collect and transmit data, actuators play the critical role of executing actions based on the data received. Actuators are devices that respond to control signals by performing physical tasks, including:

Opening or closing a valve (to control fluid or gas flow in a pipeline)

Adjusting motor speeds (for conveyor belts or robotic arms)

Turning machines on or off (for automated start/stop of equipment)

Controlling temperature (by activating heating or cooling systems)

Moving robotic arms or equipment (for assembly, material handling or other precision tasks)

In an IIoT system, actuators are responsible for automating responses to specific conditions detected by sensors. This creates the foundation for closed-loop control systems that can operate independently of human intervention. For example, if a temperature sensor detects overheating, the actuator could activate a cooling system without manual intervention. This automation reduces human labor and the chances of errors or inefficiencies in production. It also speeds up response times to deviations, minimizing waste and downtime.

Actuators can also adjust machine settings dynamically. For example, based on real-time data, they can modify the speed or pressure of a machine, ensuring the production process adapts to the changing needs of the workflow.

In more advanced IIoT setups, edge computing and AI-driven algorithms use sensor data to make autonomous decisions, triggering actuators without human oversight. This could be as simple as adjusting a process or as complex as rerouting products based on real-time data streams.

Working together in IIoT

In a typical IIoT system, the interaction between sensors and actuators follows a continuous cycle of data collection and response, which is often referred to as closed-loop control. Here’s an example (a short code sketch of the loop follows the steps):

Sensors detect changes: A temperature sensor detects that the temperature in a furnace is rising above the set threshold.

Data is sent: The sensor transmits this information to the controller (either an edge device or cloud platform) in real-time.

Data is analyzed: The controller analyzes the data and determines that corrective action is needed (e.g., the furnace is overheating).

Actuator takes action: Based on the analysis, the controller sends a signal to an actuator that opens a valve to release cooling air or turns on a cooling system.

Process adjustment: The actuator performs the task, and the sensor continues to monitor the process, feeding back data to ensure the temperature returns to safe levels.
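
Here is a minimal, self-contained sketch of that closed loop. The furnace model, set-point and control logic are simplified stand-ins for illustration only; a real system would read an actual sensor and drive an actual valve or chiller.

```python
import random

SETPOINT_C = 850.0    # target furnace temperature (made-up value)
TOLERANCE_C = 15.0    # allowed deviation before corrective action

def read_temperature_c(current: float) -> float:
    """Steps 1-2: the sensor detects the temperature and transmits it (simulated drift)."""
    return current + random.uniform(-5.0, 10.0)

def decide(temperature: float) -> str:
    """Step 3: the controller analyzes the reading against the set-point."""
    if temperature > SETPOINT_C + TOLERANCE_C:
        return "open_cooling_valve"
    if temperature < SETPOINT_C - TOLERANCE_C:
        return "increase_heating"
    return "hold"

def actuate(command: str, temperature: float) -> float:
    """Step 4: the actuator executes the command; returns the adjusted temperature."""
    if command == "open_cooling_valve":
        return temperature - 20.0
    if command == "increase_heating":
        return temperature + 20.0
    return temperature

temperature = SETPOINT_C
for cycle in range(20):                       # Step 5: the cycle repeats continuously
    temperature = read_temperature_c(temperature)
    command = decide(temperature)
    temperature = actuate(command, temperature)
    print(f"cycle {cycle:2d}: {temperature:6.1f} C, action={command}")
```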

Benefits of sensors and actuators in manufacturing

Increased Production Efficiency:

Sensors and actuators enable real-time adjustments to processes, ensuring that machines operate within optimal parameters. This minimizes downtime and keeps production flowing smoothly.

Enhanced Predictive Maintenance:

Continuous data from sensors allows for early detection of wear and tear or impending failures, reducing the need for reactive maintenance and minimizing unexpected breakdowns. Actuators can automatically adjust processes to prevent equipment damage.

Improved Quality Control:

Sensors track key quality metrics, and actuators can adjust the process instantly to ensure product quality remains consistent, reducing waste and scrap.

Operational Flexibility:

Sensors and actuators provide greater control over manufacturing systems, enabling them to respond flexibly to changes in production schedules, environmental factors, or even supply chain disruptions.

Cost Reduction:

Automation through sensors and actuators can lower labor costs and reduce human error. Moreover, optimized processes lead to less material waste, contributing to overall cost savings.

Data-Driven Decision Making:

By integrating sensors and actuators with a central data system (cloud or edge-based), manufacturers can leverage real-time analytics to gain actionable insights and make informed decisions to improve efficiency and productivity.

Common challenges

Let’s face it, maintaining a network of sensors, actuators and similar technology in a manufacturing environment can be tricky. Many environmental and workflow factors can degrade performance, even when these devices aren’t integrated into a broader IIoT implementation.

However, in IIoT manufacturing systems, several challenges are directly related to the integration of sensors and actuators into the broader industrial network. One key issue is communication latency and bandwidth limitations. IIoT systems rely heavily on real-time data transfer between sensors, actuators and control systems. Latency or insufficient bandwidth can delay data transmission or actuator responses, which is particularly troublesome in time-sensitive applications where quick reactions are essential.

Another challenge is connectivity and reliability issues. Since IIoT systems often involve wireless communication (e.g., Wi-Fi, LPWAN, or other IoT protocols), connectivity problems like signal dropouts, weak coverage or protocol incompatibility can disrupt the flow of critical data. In a networked environment, these disruptions can lead to missed sensor readings or commands not reaching actuators, causing downtime or unsafe conditions.

The sheer volume of data generated by IIoT devices can also lead to data overload and management challenges. With sensors constantly transmitting data, storage and processing systems can quickly become overwhelmed, making it difficult to extract actionable insights or react quickly to system needs. This can hinder operational efficiency, slow decision-making, and complicate data analysis.

Security vulnerabilities are another significant concern in IIoT systems. As sensors and actuators become more interconnected, they are exposed to potential cyber threats. Hackers could access the network to manipulate sensor data or control actuators, posing serious risks to both data integrity and physical safety.

Lastly, sensor and actuator compatibility can be an issue when integrating devices from different manufacturers or upgrading legacy systems. IIoT environments require seamless communication between different components, and incompatible sensors, actuators or communication protocols can lead to integration problems, system inefficiencies or even failures in real-time operations.

To address these challenges, best practices include using real-time networking protocols, implementing strong cybersecurity measures, employing edge computing to process data closer to the source, and ensuring that systems are compatible and interoperable across the IIoT network. These steps help ensure that the IIoT infrastructure operates reliably and efficiently.

What are the connectivity considerations in an IIoT implementation? https://www.engineering.com/what-are-the-connectivity-considerations-in-an-iiot-implementation/ Fri, 04 Oct 2024 15:18:53 +0000 https://www.engineering.com/?p=132475 Connectivity is the foundation of any Industrial Internet of Things (IIoT) implementation. For engineers, it’s not just about ensuring that devices and systems can talk to each other; it’s about choosing the right network architecture, protocols and security strategies to meet operational goals. In IIoT, connectivity refers to the ability of machines, sensors and control […]

Connectivity is the foundation of any Industrial Internet of Things (IIoT) implementation. For engineers, it’s not just about ensuring that devices and systems can talk to each other; it’s about choosing the right network architecture, protocols and security strategies to meet operational goals.

In IIoT, connectivity refers to the ability of machines, sensors and control systems to communicate over networks. This enables real-time data exchange and interaction between devices, local networks, edge systems and centralized cloud platforms. In IIoT implementations, this connectivity is critical to enabling the flow of data needed for process optimization, predictive maintenance, remote monitoring and real-time decision-making.

IIoT devices can range from sensors to actuators to industrial machines. For devices to exchange data directly, you’ll typically use machine-to-machine (M2M) protocols. Engineers must ensure that these devices can communicate over low-latency and robust protocols that handle the real-time data flows characteristic of industrial environments.

Protocols like Modbus, OPC UA and MQTT are industry standards used in IIoT for device-to-device communication. While these three are cornerstones of IIoT communication, there are many other protocols to choose from depending on the application, environment and system requirements. Each protocol comes with its own set of strengths and weaknesses, so it’s important to assess performance, security, scalability and interoperability when selecting a protocol for your IIoT architecture.

Another consideration is protocol overhead, which is the extra information that communication protocols add to manage data transmission, handle security, ensure data integrity and support real-time operation. While necessary for reliable, secure communication, overhead can reduce bandwidth efficiency, increase latency and consume more power, which is especially problematic in IIoT environments. Understanding and managing protocol overhead is essential for optimizing performance and efficiency in IIoT implementations.
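
As a rough, protocol-agnostic illustration of overhead, the snippet below compares one sensor reading serialized as human-readable JSON with the same reading packed into a fixed-width binary record. The field names and encoding are arbitrary choices made for the example, not a recommendation for any specific protocol.

```python
import json
import struct

# One reading: device id, Unix timestamp, temperature in degrees Celsius
reading = {"device_id": 1042, "timestamp": 1728316800, "temperature_c": 72.4}

# Human-readable JSON, as often sent over HTTP or MQTT without a second thought
json_payload = json.dumps(reading).encode("utf-8")

# The same information packed as two unsigned 32-bit integers and a 32-bit float
binary_payload = struct.pack(">IIf", reading["device_id"],
                             reading["timestamp"], reading["temperature_c"])

print(len(json_payload), "bytes as JSON")    # roughly 67 bytes
print(len(binary_payload), "bytes packed")   # 12 bytes
```

Multiplied across thousands of devices reporting every second, that difference adds up quickly on constrained links.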

Edge connectivity

Edge devices (often called edge gateways or edge controllers) act as intermediaries between the industrial devices and the cloud. They handle preprocessing and data aggregation before sending relevant information upstream.

Implementing edge computing reduces latency, conserves bandwidth and allows for real-time decision-making at the device level. Edge architecture must be scalable and secure, often integrating with local databases or edge AI algorithms to run complex analytics.

Cloud connectivity and platform integration

IIoT relies heavily on cloud-based platforms for long-term data storage, aggregation, advanced analytics and remote monitoring. Cloud platforms offer scalable environments for handling data streams from devices in the field.

Ensuring reliable connectivity between edge nodes and the cloud is vital. Engineers should also focus on data integrity and network reliability, optimizing data protocols to reduce packet loss and latency.

Common protocols and data handling

MQTT is lightweight, supports real-time data and works well in low-bandwidth environments, making it ideal for IIoT where data volumes can be massive but not all data needs to be sent in real-time.
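
As a hedged illustration, publishing a reading over MQTT with the open-source paho-mqtt client looks roughly like the sketch below; the broker address, topic hierarchy and payload fields are placeholder assumptions rather than part of any referenced system.

```python
import json
import paho.mqtt.client as mqtt  # pip install paho-mqtt

BROKER_HOST = "broker.example.local"     # placeholder broker address
TOPIC = "plant1/line3/press7/telemetry"  # placeholder topic hierarchy

client = mqtt.Client()
client.connect(BROKER_HOST, 1883, keepalive=60)

# QoS 1 asks the broker to confirm that the message was delivered at least once
payload = json.dumps({"temperature_c": 72.4, "vibration_mm_s": 3.2})
client.publish(TOPIC, payload, qos=1)

client.disconnect()
```

A subscriber on the same topic, such as an edge gateway or a cloud ingestion service, would receive the message and process it from there.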

OPC UA is widely used in industrial settings for real-time data exchange between PLCs and other industrial automation equipment. It also supports security, which is a critical concern in industrial systems.

RESTful APIs or HTTP/HTTPS are more suitable for web-based interfaces or when integrating IIoT with existing enterprise IT systems but may not offer the real-time capabilities needed for certain mission-critical operations.

How to Address Connectivity Challenges

Industrial environments can be challenging for connectivity due to electromagnetic interference, harsh physical conditions and network congestion. Implement redundant networks (dual Ethernet, cellular backup) for failover in case of primary network failures. Mesh networking in IIoT can increase reliability in environments with intermittent connectivity.

Engineers will often deal with scaling from dozens to thousands of devices over a large geographical area. To support this, it’s important to architect networks that can grow without compromising performance. This may involve local edge computing to handle localized data aggregation and minimize bandwidth requirements.

Security is paramount in IIoT, especially when sensitive operational data and critical infrastructure are involved. Use end-to-end encryption (TLS, AES) and secure communication protocols (like OPC UA with security features enabled). Additionally, ensuring device authentication, role-based access control and network segmentation can help protect against cyber threats.
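
Building on the MQTT sketch earlier, transport encryption and per-device authentication can be layered on through the client's TLS support. The certificate paths and port below are placeholders and assume a broker already configured for mutual TLS.

```python
import paho.mqtt.client as mqtt

client = mqtt.Client()

# Verify the broker with a plant CA and present a per-device client certificate (mutual TLS)
client.tls_set(
    ca_certs="/etc/iiot/certs/plant-root-ca.pem",   # placeholder paths
    certfile="/etc/iiot/certs/press7-device.pem",
    keyfile="/etc/iiot/certs/press7-device.key",
)

client.connect("broker.example.local", 8883, keepalive=60)  # 8883 is the conventional MQTT-over-TLS port
```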

Zero-trust architectures are becoming increasingly popular in industrial networks to ensure that no device or user is implicitly trusted.

Latency and bandwidth optimization

Low latency is crucial for time-sensitive operations, such as real-time control or automated responses in manufacturing. For example, 5G is being explored for IIoT because it offers low latency and high bandwidth, while LPWAN technologies (Low Power Wide Area Networks, such as LoRaWAN) provide long-range, low-power communication.

You should also look at how data is being transmitted. Use data compression, aggregation and edge processing to reduce the volume of data being sent over the network.

Technologies enhancing IIoT connectivity

With the advent of 5G, IIoT is gaining a huge advantage in terms of bandwidth and low latency. 5G allows for high-density device support and real-time communication, ideal for applications like autonomous vehicles, smart grids and advanced robotics in factories.

For environments where power efficiency is crucial and devices are spread across large areas, such as farms, pipelines or smart cities, LPWAN protocols offer extended range and low power consumption with relatively low bandwidth needs.

Edge computing reduces the need to send every bit of data to the cloud, providing a more efficient means of processing high volumes of data locally. This can include real-time anomaly detection or local decision-making that reduces latency and bandwidth needs.

Best practices for IIoT implementation

In industrial settings, systems and machines from multiple manufacturers may need to communicate with each other. Ensure your connectivity infrastructure allows for interoperability through open standards (like OPC UA) and modular architectures that can easily integrate with third-party equipment.

Track all data flows and network performance with network monitoring tools and data governance frameworks. This will help in troubleshooting, performance tuning and meeting compliance standards.

Architect your IIoT system in a modular way so new devices or protocols can be integrated without requiring a full system redesign. This modularity supports future-proofing the system as new technologies emerge.

For engineers implementing IIoT, connectivity is a multi-faceted challenge that involves choosing the right protocols, designing reliable and secure networks, optimizing for scalability and latency and ensuring devices can communicate efficiently across systems. The foundation for a successful IIoT implementation lies in robust, scalable and secure connectivity, enabling real-time data flow, remote monitoring and proactive decision-making.

Processing at the edge takes off https://www.engineering.com/processing-at-the-edge-takes-off/ Tue, 01 Oct 2024 19:57:50 +0000 https://www.engineering.com/?p=132351 Your data lives at the edge and determining how to connect processes can improve manufacturing and monitoring.

The Nvidia IGX Orin platform (left) is used in healthcare, industrial inspection and robotics (from top to bottom, on right). Source: Nvidia

Real-time and near real-time processing at the edge is more common than ever, thanks to improvements in chips and batteries. Yet a variety of logistical and technical problems present challenges for companies engaging in such processing. Fortunately, every instance of such work presents opportunities for these businesses to learn more from themselves and one another.

Implementing Industry 4.0 practices in real-time and near real-time processing at the edge requires evaluating how current procedures can be improved. Beneficial changes enable companies to handle numerous scenarios that involve interconnected procedures. For example, ensuring there is adequate security at the edge is best accomplished as a shared goal between business partners. This goal can draw on two or more tools, such as encryption and two-factor authentication.

Recent changes that have increased the amount of real-time and near real-time processing at the edge include a current capability of up to 20 trillion operations per second (TOPS) for standard semiconductors, as opposed to a single TOPS a few years ago; faster speeds and lower power consumption in different networks, from Long Range Wide Area Network (LoRaWAN) to 5G; and better software, including more artificial intelligence (AI) models, as well as new data sets and tools.

“The edge is where the data come from. Bringing the processing to companies working in these spaces is the goal. Such action can bring deployment time down by as much as a third, like from 18 months to six months. That presents cost savings and better opportunities to leverage AI,” says Pete Bernard, Executive Director of tinyML Foundation.

tinyML is a Seattle-based nonprofit that focuses on low-power AI at the edge of the cloud. Its members include large corporations such as Qualcomm and Sony, academic institutions like Johns Hopkins University and nongovernmental organizations.

“tinyML holds frequent events to build community around the concept of the edge. We educate people about the potential of working at it. Our programs include contests, conferences, hackathons and workshops. One of the concepts we are considering now is data provenance,” says Bernard.

This idea relates to the watermarking of data sets and models. AI models must be trained on data sets. Stamping provenance helps users identify sources of data and the developers behind them. Such work makes it easier to integrate different data sets and models.

Software for the edge

Simplifying edge operations is easier to accomplish with software designed for that purpose, like Dell’s NativeEdge platform. 

Dell’s NativeEdge platform helps enterprises work with data generated at the edge. Source: Dell

“With NativeEdge, a client can build an AI model to operate at the edge. They can retrain the model onsite at the edge. This saves money and gives them the ability to scale up the solution as needed,” says Pierluca Chiodelli, Vice President, Edge Engineering and Product Management at Dell Technologies.

Dell sees security as the biggest challenge for clients.

A company that tries to do everything itself runs the risk of exposing information. Any entity that generates data must protect the data at the points where the data is created and stored.

Dell is enhancing security by working closely with NVIDIA, which developed the AI Enterprise software integrated with NativeEdge’s engine.

“Inference at the edge, which involves gathering data with AI techniques, is really important. Everybody needs to have a way to deploy and secure that. Also a company has to maintain its AI stack, the tools and services to use AI correctly. It must have a blueprint to update all the pieces of the puzzle,” says Chiodelli.

As the different components of an AI stack can change, a company must be aware of all of them and how they interact. This helps the company make the necessary adjustments in proportion and on the appropriate timeline. Such work prevents deviations in manufactured products and slowdowns in production time. It also minimizes the time needed to retrain AI models and workers.

The market for the edge is growing

Nvidia is working on numerous hardware and software applications to meet the needs of companies utilizing edge computing. The company sees this market as expanding. A March 2024 forecast from the International Data Corp. stated worldwide spending on edge computing is expected to be $232 billion this year.

One of Nvidia’s platforms for the edge is the Nvidia IGX Orin with NVIDIA Holoscan, which is designed for real-time AI computing in industrial and medical environments. This platform provides high performance hardware and enterprise AI software. The platform is for companies working in robotics, healthcare, scientific research, video analytics and broadcasting.

In scientific computing, the Nvidia IGX Orin with Holoscan platform has the power to stream high-bandwidth sensor data to the GPU. It can use AI to detect anomalies, drive sensor autonomy and lower the time to scientific insights. In the medical space, Magic Leap has already integrated Holoscan in its extended reality (XR) software stack to enhance the capabilities of customers. This has allowed one of its clients in software development to provide real-time support for minimally invasive treatments of stroke.

It’s difficult to establish interoperability across systems, says Chen Su, Senior Technical Product Marketing Manager of Edge AI and Robotics for Nvidia.

 “Today there are numerous installed legacy systems that weren’t originally designed with AI capabilities in mind. Integrating AI into those systems and still achieving real-time performance continues to pose a significant challenge. This can be overcome by developing industry-wide standards that can meet the complex connectivity requirements across sensors, actuators, control systems and interfaces,” says Su.

Once the task above is accomplished, the entire edge AI system will have no bottleneck in communication. It can then act in a software-defined manner, making the system more flexible and easier to manage.

STMicroelectronics (ST), a global manufacturer and designer of semiconductors, meets the needs of companies that process data in real-time and near real-time with a variety of edge AI tools and products.

These include pure-software edge AI solutions for the STM32 and Stellar-E microcontroller (MCU) families; the incoming STM32N6, a high-performance STM32 MCU with an ST proprietary Neural Processing Unit (NPU); and the STM32MP2 microprocessor series.

Danilo Pau, Technical Director in System Research and Applications at STMicroelectronics, says advances in embedded AI computing that enable processing at the edge require higher energy efficiency. The task is made possible by a mix of assets, including super-integrated NPU accelerators, in-memory computing (IMC) and 18nm Fully Depleted Silicon On Insulator (FD-SOI) ST technologies. Such resources can be super-integrated close to standard MCU and microprocessor unit (MPU) cores for viable, high-volume, low-cost manufacturing.

“There is also the super-integration of heterogeneous technologies in a single package, achieved by the Intelligent Sensor Processing Unit (ISPU) and Machine-Learning Core (MLC) product families. In a tiny package, micro-electromechanical systems (MEMS) sensors, analog and digital technologies are stacked for large and cheap sensor volumes. They operate at microwatt power consumption. This is a fundamental contribution that enables the incoming trillion-sensor economy envisaged by many IoT experts,” says Pau.

Organizations like tinyML Foundation play an important role in the business community. Since 2018, tinyML has encouraged many companies to invest in generative AI at the edge (edgeGenAI).

Pau says there is need of even greater energy efficiency and super integration of heterogeneous technologies, including NPU, IMC, deep submicron technologies and sensors.

“The vision is to design embedded systems that match the energy efficiency of the human brain,” says Pau.

He adds that companies will increasingly need more education about edge AI technologies and tools, along with greater mastery of the relevant skills.

That fact explains why ST, which is currently Europe’s largest designer and manufacturer of custom semiconductors, is an active part of the tinyML community.

“ST works with many actors in the edgeGenAI ecosystem. We’re eager to see this ecosystem expand and serve AI developers in the best and most productive way. That will ease their path in bringing innovation to the edge AI market,” says Pau.

How quantum computing is already changing manufacturing https://www.engineering.com/how-quantum-computing-is-already-changing-manufacturing/ Tue, 24 Sep 2024 13:23:55 +0000 https://www.engineering.com/?p=132147 The prospects for hybrid quantum optimization algorithms in the manufacturing industry are particularly promising.

A laser setup for cooling, controlling, entangling individual molecules at Princeton University. (Image: National Science Foundation/Photo by Richard Soden, Department of Physics, Princeton University).

Various industries are becoming increasingly aware of the potential of quantum technology and the prospects for the manufacturing industry are particularly promising. There are already quantum algorithms being used for specific manufacturing tasks. These are hybrid algorithms that combine quantum calculations and conventional computing—in particular high-performance computing. As the first benefits of quantum technology are already being realized today, it’s worthwhile for companies to familiarize themselves with the technology now.

Where quantum algorithms fit manufacturing

To find suitable use cases, it’s helpful to know about one of the most popular hybrid algorithms: the Quantum Approximate Optimization Algorithm (QAOA). QAOA is considered a variational algorithm and is used to find solutions to optimization problems. A Variational Quantum Algorithm (VQA) is an algorithm based on the variational method, which involves a series of educated guesses performed by a quantum computer and refined by classical optimizers until an approximate solution is found. This iterative process combines classical computers with quantum computers, allowing companies to access the benefits of quantum computing more quickly, rather than waiting for technological breakthroughs that may not happen for several years.
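
The outer, variational loop of such hybrid algorithms can be sketched without any quantum hardware at all: a classical optimizer repeatedly proposes circuit parameters and asks the quantum side for an expectation value to minimize. In the sketch below, the expectation function is a purely classical stand-in, so it shows only the structure of the loop, not a real QAOA circuit evaluation.

```python
import numpy as np
from scipy.optimize import minimize

def estimated_expectation(params: np.ndarray) -> float:
    """Stand-in for the expectation value a quantum processor would estimate
    for a parameterized circuit; a real QAOA run would sample a circuit here."""
    gamma, beta = params
    return np.cos(gamma) * np.sin(2 * beta) + 0.5 * np.cos(2 * gamma)

# The classical optimizer proposes parameters, the "quantum" side scores them,
# and the loop repeats until the estimate stops improving.
initial_guess = np.array([0.1, 0.1])
result = minimize(estimated_expectation, initial_guess, method="COBYLA")

print("optimized parameters:", result.x)
print("estimated minimum expectation:", result.fun)
```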

Hybrid quantum algorithms open creative possibilities for challenges in manufacturing. For example, it will be possible to develop new, better materials by simulating the interaction of molecules more reliably and quickly. Classical computers already struggle to simulate simple molecules correctly. Since quantum computers can explore several possible paths simultaneously, they are better able to calculate complex interactions and dependencies. This reduces the cost and time required to research and produce innovative materials – which is particularly promising for the development of better batteries for electric cars.

Quantum calculations can also make a difference in logistics and inventory management, where the “traveling salesman problem” is a recurring challenge: what is the shortest route to visit a list of locations exactly once and then return to the starting point? When solving this type of problem, quantum computers are significantly faster than traditional systems. Even with just eight locations, a traditional computer needs more than 40,000 steps, whereas a quantum computer can solve it in about 200 steps. Those who firmly integrate such calculations into work processes will be able to save a lot of time and resources.
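
The brute-force baseline is easy to verify on a classical machine: enumerating every ordering of eight stops means checking 8! = 40,320 candidate routes, as the sketch below (with made-up coordinates) confirms.

```python
from itertools import permutations
from math import dist

# Eight made-up coordinates for illustration only
locations = [(0, 0), (2, 5), (6, 1), (7, 7), (3, 9), (9, 4), (5, 5), (1, 8)]

def tour_length(order):
    """Total length of a closed tour that returns to its starting point."""
    return sum(dist(order[i], order[(i + 1) % len(order)]) for i in range(len(order)))

best_length = None
routes_checked = 0
for route in permutations(locations):        # every ordering of the eight stops
    length = tour_length(route)
    if best_length is None or length < best_length:
        best_length = length
    routes_checked += 1

print(f"routes checked: {routes_checked}")        # 8! = 40,320
print(f"shortest round trip: {best_length:.2f}")
```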

The situation is similar for supply chains. Maintaining one’s supply chain despite geopolitical upheavals is increasingly becoming a hurdle for the manufacturing industry. Remaining flexible is easier said than done, as changing suppliers can quickly lead to delays in the workflow. Although most manufacturers have contingency plans and replacement suppliers at the ready, the market is convoluted. Huge amounts of data must be considered to find the cost-optimal and efficient supply chain. Quantum algorithms can handle this and allow ad hoc queries of this kind, which is a decisive advantage in volatile situations.

Approaching the quantum advantage

Hybrid quantum algorithms can be used in a variety of ways. Volkswagen, for example, found a use case in the application of car paint and was able to optimize this process. It was possible to reduce the amount of paint used and speed up the application process at the same time. 

Some practices help manufacturers to enter quantum computing via hybrid quantum algorithms. Although the full quantum advantage will only unfold in the future, awareness of the technology’s potential is important today. Now is the best time to actively engage with quantum computing and identify industry-specific use cases. This makes it possible to estimate the complexity of the problems and the computing power required. This in turn makes it easier to estimate when the right hardware might be available.

Once suitable application scenarios have been found, there is no need to wait for the ideal quantum hardware. Instead, manufacturers should try their hand at a simplified program for a specific scenario and combine the latest quantum technology with conventional systems. At best, this hybrid approach can achieve a proof of concept and realize tangible improvements – Volkswagen is a good example of this.

It’s also important to note that it’s usually not necessary to learn programming for quantum computing at the machine language level. There are already higher-level programming languages that hide much of that complexity and are therefore easier to learn. The market also has platforms that represent quantum-based applications via graphical user interfaces. These can help development teams show these applications to other departments and make them easier to understand. It’s advisable to focus on platforms that are cloud-based and agnostic in terms of hardware. It’s currently still unclear which hardware will prevail in quantum computing. Flexibility is therefore particularly valuable to minimize conversion costs, which can be incurred with on-premise installations.

A strategic investment

Even with the most innovative technologies, big changes don’t happen overnight. While we will see leaps towards the full quantum advantage, it will also take time to be fully applicable. The bottom line is that those who have prepared themselves earlier will be able to utilize the quantum advantage sooner. The transition to quantum computing can be a challenge if not enough groundwork has been done. A smooth transition is possible if employees are trained in the use and maintenance of quantum systems.

The introduction of hybrid quantum algorithms is also strategically valuable due to potential patent applications. Only an early discovery of industry-specific quantum applications allows manufacturers to quickly fill their portfolio and legally secure this intellectual property.

Erik Garcell is head of technical marketing at Classiq Technologies, a developer of quantum software. He has a doctorate in physics from the University of Rochester and a master’s in technical entrepreneurship and management from Rochester’s Simon School of Business.

What are the key aspects of IIoT? https://www.engineering.com/what-are-the-key-aspects-of-iiot/ Wed, 04 Sep 2024 18:14:25 +0000 https://www.engineering.com/?p=131490 The industrial Internet of Things is playing a pivotal role in shaping the future of manufacturing. Here we explore what it is and how it all started.

At its core, the industrial Internet of Things (IIoT) is about infusing traditional industrial environments with advanced digital technology. Sensors and smart devices are embedded in machinery to continuously collect data on everything from temperature to vibration. These sensors actively monitor and report on the performance and condition of equipment in real-time.

This data may seem like just a flood of numbers, but with the right mindset it’s a treasure trove of potentially actionable insights. In a well thought out implementation, advanced analytics and machine learning algorithms sift through this data, uncovering patterns and trends that were previously hidden. This means that rather than waiting for a machine to break down, manufacturers can now predict when a failure might occur and address it before any disruption. This proactive approach helps to reduce unexpected downtime and extend the lifespan of equipment.

The connectivity that IIoT brings means managers can adjust processes on the fly, optimize resource use and even automate many aspects of production. This level of automation boosts efficiency and enhances productivity, allowing for more streamlined operations and higher output.

Cost savings are another significant benefit of IIoT. By minimizing unplanned maintenance and optimizing energy consumption, manufacturers can reduce their operational expenses. Predictive maintenance, for example, ensures that equipment is serviced only when needed, rather than on a fixed schedule or after a failure.

Moreover, IIoT introduces a new level of flexibility into manufacturing. Factories equipped with IIoT technology can adapt more easily to changes in designs, demand or shifts in production requirements. Ideally, manufacturers can quickly reconfigure their operations or scale them up or down based on real-time needs, making them more responsive to market fluctuations.

Safety and regulatory compliance are also enhanced through IIoT. The continuous monitoring of equipment helps identify potential hazards before they become serious issues, creating a safer working environment. Additionally, accurate data collection supports compliance reporting with safety standards and regulations.

Additionally, wider supply chains benefit from the integration of IIoT. With better tracking and management capabilities, manufacturers can improve logistics decisions and inventory management, ensuring that materials and products move seamlessly through the supply chain.

In essence, IIoT has the potential to transform traditional manufacturing into a dynamic, data-driven environment. It’s turning factories into smart, connected ecosystems where every machine and process is in constant communication, leading to smarter decisions, greater efficiency, and a more agile and responsive production environment.

Evolution of IIoT: PLCs set the stage

The roots of IIoT can be traced back to the mid-20th century when electronic controls and automation began to take shape. The introduction of programmable logic controllers (PLCs) in the 1960s marked a significant leap forward, allowing machines to be controlled with greater precision and flexibility.

The 1980s and 1990s saw the integration of computer technology into industrial environments. The advent of personal computers and advancements in software led to the development of more sophisticated control systems. Manufacturing Execution Systems (MES) emerged, providing real-time data on production processes and improving operational efficiency. However, these systems were often isolated and lacked the connectivity seen in modern systems.

Enter IoT

The concept of the Internet of Things (IoT) began to take shape in the early 2000s, thanks to the proliferation of internet connectivity and sensor technology. Kevin Ashton coined the term “Internet of Things” in 1999 while working at Procter & Gamble, envisioning a future where everyday objects could communicate over the internet. This concept initially focused on consumer applications but laid the groundwork for what would become IIoT.

The early 2010s marked the formal emergence of IIoT as a distinct concept. As broadband internet and wireless technologies matured, machine builders began to integrate internet connectivity into industrial machinery and processes. The introduction of smart sensors, which collected and transmitted data about various operational parameters, was a game-changer. These sensors, coupled with advances in cloud computing and big data analytics, enabled real-time monitoring and analysis of industrial processes on an unprecedented scale.

Advancements and adoption

By the mid-2010s, IIoT had gained substantial traction across various sectors. The integration of advanced analytics and machine learning allowed for deeper insights and predictive capabilities. Industries from manufacturing to energy and transportation embraced IIoT to enhance efficiency, reduce downtime and optimize operations. The development of edge computing, which processes data closer to the source rather than relying solely on centralized cloud servers, further accelerated IIoT adoption by reducing latency and improving responsiveness.

Current state and future developments

Today, IIoT is a cornerstone of Industry 4.0, the fourth industrial revolution characterized by digital transformation and smart technologies. Modern IIoT systems leverage a combination of sophisticated sensors, advanced analytics and interoperable platforms to create highly efficient and adaptive industrial environments. Innovations such as digital twins—virtual replicas of physical systems—allow for simulation and optimization of industrial processes in real-time.

Looking ahead, the evolution of IIoT continues with advancements in AI and 5G connectivity, which promise even faster data transmission and more nuanced, automated analysis. As industries strive for greater automation, efficiency and sustainability, IIoT is expected to play an increasingly pivotal role in shaping the future of manufacturing and beyond.

The 5 layers of digital transformation https://www.engineering.com/the-5-layers-of-digital-transformation/ Fri, 19 Jul 2024 16:58:33 +0000 https://www.engineering.com/?p=52440 How to think about digital integration and transformation within a company or process.

Embarking on digital transformation for an aerospace manufacturing company signifies a strategic shift towards integrating advanced digital technologies across all facets of operations.

This includes using technologies such as Industrial Internet of Things (IIoT) for real-time monitoring of equipment and systems, implementing artificial intelligence (AI) and machine learning algorithms for predictive maintenance and optimized production scheduling and adopting digital twins to simulate and optimize the performance of aircraft components and systems.

The digitalization pyramid

The digitalization pyramid is a conceptual framework used in industrial and organizational contexts to illustrate the levels of digital integration and transformation within a company or process.

It consists of several layers or stages, each representing different aspects of digitalization. While variations exist, a common representation includes the following layers (a brief code sketch of the lower layers follows the list):

Data collection: The base layer of the pyramid involves the collection of raw data from various sources within the organization or across the value chain. This data can come from sensors, machines, devices, databases or virtually any system that collects data.

Data integration: The next layer is about integrating and consolidating the collected data into a unified format or system. This stage ensures that data from different sources can be accessed, processed and analyzed.

Data analysis: You guessed it. This layer is about analyzing the integrated data to derive insights, trends, patterns and actionable information. Techniques such as statistical analysis, machine learning and artificial intelligence are a natural fit here.

Digitalization: This layer involves the transformation of business processes and operations using digital technologies and insights gained from data analysis. It includes automation, optimization and the use of digital tools to streamline workflows and improve efficiency.

Digital transformation: This last phase is the goal of the entire exercise and represents the strategic adoption of digital technologies to fundamentally change how a business operates, delivers value to customers and competes in the market. It may involve new business models, innovative products or services and a shift towards a more data-driven and agile organization.
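
To picture the lower layers concretely, the toy sketch below collects readings from two hypothetical sources, integrates them into one unified record format and derives a simple insight from the result. The sources, field names and conversions are invented for illustration; a real pipeline would be far larger but follows the same shape.

```python
from statistics import mean

# Layer 1 - data collection: raw readings from two hypothetical sources
plc_readings = [{"ts": 1, "temp_f": 352.0}, {"ts": 2, "temp_f": 360.0}]
sensor_readings = [{"time": 1, "celsius": 176.5}, {"time": 2, "celsius": 183.0}]

# Layer 2 - data integration: consolidate both feeds into one unified format (degrees Celsius)
unified = (
    [{"ts": r["ts"], "temp_c": (r["temp_f"] - 32) * 5 / 9} for r in plc_readings]
    + [{"ts": r["time"], "temp_c": r["celsius"]} for r in sensor_readings]
)

# Layer 3 - data analysis: derive a simple, actionable insight from the integrated data
average_c = mean(r["temp_c"] for r in unified)
print(f"average furnace temperature: {average_c:.1f} degrees C across {len(unified)} records")
```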

This is a basic roadmap for organizations looking to evolve and harness the power of digital technologies, but nothing about this process is basic. Each one of these phases is made up of many complicated initiatives and no company can do this properly without good partners in the process.

What’s the difference between digitization and digitalization?

The terms “digitization” and “digitalization” are related but have distinct meanings in the context of technology and business transformation:

Digitization refers to the process of converting information or data from analog to digital form. It involves transforming physical or analog artifacts (such as documents, images, videos or processes) into digital formats that can be processed, stored and transmitted electronically. Examples include scanning paper documents to create digital copies, converting analog audio or video recordings into digital formats or creating digital records of interactions between machines.

Digitalization is the broader process of integrating digital technologies into various aspects of business operations, processes and strategies to fundamentally change how they operate and deliver value to customers. It relies on digital technologies (like AI, IoT, cloud computing, data analytics) to improve efficiency, create new business models, enhance customer experiences and innovate within an organization. Some examples would be implementing IoT sensors to gather real-time data for predictive maintenance, using AI algorithms to automate decision-making processes, adopting cloud-based solutions for scalable operations or redesigning customer interactions through digital channels.

How ‘AI at the edge’ enables decision superiority on today’s battlefield https://www.engineering.com/how-ai-at-the-edge-enables-decision-superiority-on-todays-battlefield/ Fri, 19 Jul 2024 03:29:55 +0000 https://www.engineering.com/?p=52425 Thanks to recent technological advances, militaries can co-locate sensors and ruggedized, AI-powered supercomputers in tiny spaces across big networks – and acquire the information advantage they crave.

In October 2020, the U.S. Army and U.S. Air Force announced the Combined Joint All-Domain Command and Control (CJADC2) Implementation Plan, a unified effort to provide strategic advantage in large-scale, multidomain battles. Deemed a warfighting necessity by the U.S. Department of Defense, the CJADC2 framework helps inform modern warfare initiatives by better managing data volume and complexity. This initiative is designed to give U.S. military personnel a tactical advantage and provide the same across and within allied and partner nations, promoting a more collaborative and effective multinational defense strategy.

The CJADC2 concept creates a secure, unified command and control network across all defense branches and among allied partners. (Image: Aitech)

CJADC2 aims to create a multinational framework for integrated command and control that enables military forces to sense, make sense and act, in the DOD’s words, “at all levels and phases of war, across all domains, and with partners, to deliver information advantage at the speed of relevance.” In other words, it gives them decision superiority: the ability to assimilate, analyze and respond to information acquired from the battlespace more rapidly than an adversary. 

While this definition captures what CJADC2 aims to achieve, it says little about how to achieve it. However, some lessons learned in recent conflicts have been integrated with this data-driven warfighting concept.

Growing Use of Digitized Information

CJADC2 uses artificial intelligence (AI), machine learning (ML), decision autonomy and other advanced capabilities to better connect sensors with shooters (e.g., soldiers, tanks, UAVs) and reduce the time it takes to bring lethal and non-lethal effects against an adversary to influence multidomain operations. Not surprisingly, AI-based, compact supercomputers designed to manage a growing amount of data and inputs are increasingly used in military and defense operations. By using this compact, rugged computing technology incorporated directly into today’s defense platforms, military operations gain better intelligence, faster, which leads to more successful outcomes.

Throughout history, decision superiority has always been crucial to winning or losing battles, with success destined for those who can best leverage and secure information to make the best decision in the shortest time. Military conflicts in the 21st century will continue to utilize this strategy at an accelerated pace, thanks to advancements in AI and data processing.

Objectives such as lowering the cognitive load of soldiers and decision-makers and decreasing the response time to gain an advantage represent just some of the requirements, risks and technical challenges being addressed.

The Importance of Shared Intelligence

Advances in telecommunications, sensors, processing power and weapons, along with the growing utility of space and cyberspace as operational domains, have fundamentally shifted the character of command and control in warfare. Data is the new strategic asset that is employed enterprise-wide in multidomain operations to achieve a holistic approach.

The benefits of networked communication include:

  • Streamlining of large data transfers from sensors to mission computer
  • Improved system response time
  • Reduced wiring complexity, increasing system reliability, availability and maintainability
  • Improved upgradeability and scalability

One iteration of CJADC2 focuses on creating a global targeting system that can enable combatants to locate, target and engage the enemy, then assess the results – a critical process known as the kill chain. Another looks at how CJADC2 can assist with achieving decision superiority to maneuver forces to positions of advantage to prevent an adversary from meeting their objectives. This iteration has recently been analyzed in manned and unmanned ground systems in land operations.

AI in Ground Operations

One of the most complex ground-based maneuvers is a wet-gap crossing. However, there are distinct logistical challenges in planning and executing these critical operations. When successfully executed, a wet-gap crossing operation can provide one of the most valuable assets in war – speed – to seize the initiative, prevent enemy reconnaissance, and exploit success. Executing a safe and efficient wet-gap crossing allows friendly forces to set the necessary conditions for further success.

Artificial intelligence and machine learning could significantly reduce risk in complex military maneuvers like wet-gap crossings. (Image: Aitech)

A recent analysis of a failed wet-gap crossing by Russian forces in eastern Ukraine over the Siverskyi Donets River highlighted many challenges and risks associated with this complex operation and identified potential technical solutions using AI/ML and other critical technologies.

Information Flow Improves Risk Analysis

Since most future breaching operations will likely be conducted using unmanned or optionally manned systems, large amounts of data must be secured and transmitted across tactical networks to synchronize reconnaissance and security, logistics and other warfighting functions.

At the macro level, CJADC2 involves gathering massive quantities of data through a broad range of distributed sensors and processing it into actionable information. The system is stitched together with a robust set of communication links that allocate the correct information across the network to enable organizations to achieve enhanced effects in their specific areas of responsibility.

The OODA Loop, a well-known and accepted decision-making model, describes a four-step process for executing combat operations: Observe, Orient, Decide, Act. Developed by U.S. Air Force Col. John Boyd, it emphasizes the importance of speed and agility in decision-making and action-taking to complete the loop as quickly and efficiently as possible so that you can adapt to changing circumstances and take advantage of opportunities as they arise.

“AI at the edge” can accelerate the OODA Loop used in military operations. (Image: Aitech)

Deploying AI algorithms on devices physically close to the data source – an approach known as “AI at the edge” (AIAE) – allows decisions to be made with minimal latency and provides flexibility in a rapidly changing environment. For example, connecting sensors directly to the AIAE unit will greatly reduce latency between the observe and orient steps in the OODA Loop.

Latency between the orient and decide steps drops for the same reason: there’s no need to send large amounts of data to a distant node for additional decision-making and then wait for the decision to be sent back. Issuing the resulting “act” command directly from the AIAE unit reduces latency for the decide-act steps as well.
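
The structure of that argument can be shown with a toy timing loop in which all four OODA steps run on the same edge node, so no network round trip is involved. The sensor read, model and actuator calls below are stand-in functions used only to illustrate the shape of the loop.

```python
import random
import time

def observe() -> float:
    """Stand-in for reading a co-located sensor."""
    return random.uniform(0.0, 1.0)

def orient(measurement: float) -> float:
    """Stand-in for an on-device AI model scoring the observation."""
    return measurement * 0.9

def decide(score: float) -> str:
    """Threshold-based decision; a real system would weigh many inputs."""
    return "respond" if score > 0.7 else "hold"

def act(command: str) -> None:
    """Stand-in for driving a local actuator or effector."""
    pass

start = time.perf_counter()
for _ in range(1000):                         # run the full loop locally, with no network hop
    act(decide(orient(observe())))
elapsed_ms = (time.perf_counter() - start) * 1000
print(f"1,000 OODA cycles completed locally in {elapsed_ms:.1f} ms")
```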

Decision superiority through processor design

A dominant commercial-off-the-shelf solution for AIAE processing is the general-purpose graphics processing unit (GPGPU). GPGPUs can handle large amounts of data in parallel – much faster than traditional central processing units (CPUs) – thereby accelerating a wide range of AI applications.

Modules in the NVIDIA Jetson family combine AI-capable GPGPUs with multicore CPUs to create a tightly coupled, high-performance, low-power supercomputer that supports AI processing and decision-making applications software.

For example, the NVIDIA Jetson Xavier NX module delivers six trillion floating-point operations per second (6 TFLOPS) at a maximum power of 15 watts. This performance is comparable to that of a several-hundred-watt workstation with a processor and GPU cards.

This type of computing architecture can process and apply AI algorithms for more than 20 high-definition video inputs with 1080p resolution at a rate of 30 frames per second – enough bandwidth to run AI applications for a system of multiple high-definition cameras. For defense operations, the high processing capabilities of the NVIDIA architecture enable AIAE processing, thanks to the compact supercomputers embedded within the military platform.

A ruggedized supercomputer with an NVIDIA Jetson Xavier NX module can be as small as 4” x 2.3” x 3.9”. With its low power consumption and maximum weight of 1.3 lbs., it’s an ideal candidate for AIAE applications from the perspectives of performance and SWaP: size, weight and power.

The A179 Lightning is a compact, AI-powered supercomputer that can process vast amounts of sensor data at the edge of networked military hardware. (Image: Aitech)

Other Considerations of AI in Military Operations

AIAE’s numerous benefits, such as reduced latency and increased security, also present some technological challenges, including limited processing power and storage as well as energy efficiency.

Using rugged AI supercomputing modules addresses many of these challenges, but there are also concerns of data transfer and security.

Time-sensitive networking (TSN) is a communication protocol that ensures critical information reaches decision-makers without delay by transmitting real-time data with high precision and reliability. It also facilitates the collection, aggregation, and analysis of this real-time data, empowering decision-makers with accurate, up-to-date information.

TSN synchronizes devices and systems across distributed networks to ensure that data from multiple sources is aligned and consistent. This provides a holistic view of the operational environment and enhances coordination between different components, such as sensors, actuators, and control systems, for seamless collaboration and integration.

This brings into play AIAE’s cybersecurity parameters to ensure high-performance AI-capable systems are protected from cyber and spoofing attacks, securing shared information in several ways.  These include reducing the amount of data shared across tactical networks, simplifying data distribution efforts, reducing system latency, improving data redundancy at the sensors and eliminating interoperability issues between systems since all use the same communications protocols and data messaging structures.

Leveraging AI/ML and advanced algorithmic warfare systems provides a significant decision-making advantage. Rugged, compact supercomputers can help manage the influx of data that systems must handle while providing improved intelligence in military operations.

Every system and program should mandate sensor data sharing and interoperability. This data-sharing construct can create secured battlespace awareness, in which actions in one part of the single, integrated, global battlespace can be understood and informed by actions and decisions required in other areas.

Timothy Stewart, BSME, is Director of Business Development at Aitech, which develops rugged embedded-computer solutions for industrial, military and aerospace applications. He has 20 years of experience in high-technology hardware, software and networking products. Timothy holds a BS in mechanical engineering and physics from Boston University.

Device Trust: Securing the Future of Smart Technology https://www.engineering.com/resources/device-trust-securing-the-future-of-smart-technology/ https://www.engineering.com/resources/device-trust-securing-the-future-of-smart-technology/#respond Tue, 19 Mar 2024 19:11:12 +0000 https://www.engineering.com/resources/device-trust-securing-the-future-of-smart-technology/ The stakes for device security and operational excellence have never been higher. For manufacturers seeking to navigate such a complex security landscape, the choice is clear: Adopt a proactive stance toward device security or risk being left behind.

In today’s dynamic technological landscape, the rise of connected devices is revolutionizing how we interact with the world around us. From smart homes to industrial settings, these devices offer unprecedented levels of convenience and efficiency. However, with this connectivity comes an inherent need for robust security measures.

Enterprises must adopt a security-first approach in manufacturing connected devices to mitigate potential risks effectively. This entails implementing immutable digital identities, engineering tamper resistance, and adhering to global compliance standards. These pillars form the foundation of a secure ecosystem, ensuring that devices are protected against cyber threats and unauthorized access.

Real-world applications underscore the importance of these security measures, demonstrating their tangible impact across various industries. From enhancing privacy in smart homes to safeguarding critical infrastructure, a proactive stance on security not only protects assets but also fosters trust among consumers and stakeholders. By embracing these principles, manufacturers can navigate the complexities of the connected world with confidence, paving the way for a safer and more resilient future. Learn the 4 pillars of device security in this comprehensive ebook.

Your download is sponsored by DigiCert.

A Guide to an Effective Code and Software Signing Policy https://www.engineering.com/resources/a-guide-to-an-effective-code-and-software-signing-policy/ https://www.engineering.com/resources/a-guide-to-an-effective-code-and-software-signing-policy/#respond Tue, 19 Mar 2024 18:53:04 +0000 https://www.engineering.com/resources/935/ This guide will help you formulate a strategy that combines software, process, and policy to achieve end-to-end security in your signing practices.

In today’s rapidly evolving threat landscape, where cyber attacks and malicious software are becoming increasingly sophisticated, the importance of establishing clear and well-documented code signing policies cannot be overstated. These policies serve as a critical line of defense in safeguarding software integrity and ensuring that only trusted code is executed. However, navigating the intricacies of initiating this process can often seem like a daunting task.

Crafting a comprehensive and strategic code signing policy involves addressing various challenges, particularly in engaging busy developers who are accustomed to operating under tight deadlines. Convincing them to adhere to security measures can sometimes be met with resistance, as their primary focus tends to be on delivering results quickly rather than prioritizing security protocols.

This guide aims to provide a roadmap for organizations looking to implement effective code signing policies, drawing upon insights from DevOps experts and seasoned security professionals. By leveraging a combination of software tools, streamlined processes, and well-defined policies, organizations can establish robust security practices that span the entire software development lifecycle.

This guide, based on insights from DevOps experts and security professionals, provides a strategic approach combining software, process, and policy to ensure end-to-end security in signing practices. Learn today how you can prepare your teams and establish a strong security culture.

Your download is sponsored by DigiCert.
