Advanced Manufacturing - Engineering.com https://www.engineering.com/category/technology/advanced-manufacturing/ Fri, 01 Nov 2024 15:06:23 +0000

Focus on tech performance when training for digital transformation https://www.engineering.com/focus-on-tech-performance-when-training-for-digital-transformation/ Fri, 01 Nov 2024 15:06:21 +0000 Training must be based on ensuring the performance objectives of digital transformation are achieved.

The post Focus on tech performance when training for digital transformation appeared first on Engineering.com.

Digital transformation is the term used to describe the change that’s taking place in organizations in response to the Fourth Industrial Revolution. Until recently, organizations were built to reliably do tomorrow what they did today. They were not designed for our era of rapid technological change. For organizations to succeed in our new age, they need to innovate and respond to the rapidly changing environment they face. Many of the skills that have enabled them to be successful in the past remain important in repeatedly producing at a good price and quality, but there are a range of new skills that are needed.

In a recent survey conducted by Seattle-based IT services firm Veeam Software, 54% of organizations report that “lack of IT skills or transformation expertise” is a challenge with their digital transformation (source: https://go.veeam.com/wp-data-protection-trends-2024). They feel they don’t have the basic capabilities for technology-based change. While recruitment will often help address this, most organizations understand that training and retaining their own people is going to be important.

Discussions and articles on digital transformation skills today help us understand the skills development areas that organizations think are important. Often, the emphasis is on specialist technical skills, change management, creativity, adaptability and broad digital literacy.

These are based on the technical needs for implementing, operating and maintaining the new technology, managing the change with employees, developing creativity that will help exploit the technology and making everyone more comfortable with and able to use it in their jobs. These areas are all relevant, but they are not sufficient for effective digital transformation. Rather than only focusing on technology introduction, training needs to be based on ensuring the performance objectives of its introduction are achieved.

Focus on strategic goals

Digital transformation should be based on the organization’s strategic goals. Strategic analysis, including understanding of the market environment, the range of technologies and their uses and the internal capabilities in the organization, is needed to determine organizational priorities for digital transformation. Introduction of technology to achieve these priorities requires a systems approach, integrating understanding across the organization to make decisions that take into account their strategic implications and possibilities.

Implementation requires collaboration within and between existing organizational silos. As implementation proceeds, decisions with organization-wide implications may be made – for instance, on production lines where automation of one area may reduce or change work in another.

Once implementation is complete, exploitation of technologies requires more collaborative work – artificial intelligence may provide insights based on your data, but these will only turn into value if you have the capability to act on them. In many organizations, improvement activity is slow and contentious and requires a much more supportive environment. Even with this, frequent process modifications without careful collaboration will create chaos, as changes are made without adequate understanding of their implications. Collaboration and teamworking skills, supported by organizational leadership that fosters a supportive environment, are critical to digital transformation.

Within Lean and Agile operating models there is often an understanding that training employees in a wider range of skills is needed to enable them to participate in innovation and continuous improvement. If each employee knows only a small part of the overall production process, they are less likely to have viable improvement ideas or be able to work with their colleagues to implement them.

In digital transformation this is even more important. Employees must have a better technical understanding, not just of new technologies, but of the processes they use themselves today and those in their general work area. Technical training on existing processes is a vital part of digital transformation.

Broader knowledge of the market environment and the organization’s overall strategy should inform all digital transformation activity, including technology focused innovation and continuous improvement. Regular, compulsory training to do this, supported by an effective communications process, is necessary to achieve an adequate level of understanding.

DX performance training categories

To help you consider the training needed for your digital transformation we have developed the following categories. Training for each work area should be considered in all of these, and plans developed to achieve the capability needed. Using the categories will enable you to reflect on the needs in each area and plan the training that each employee will receive. The categories are:

Tech Awareness: Awareness of the basic characteristics and uses of the main technologies available to organizations today. In a world where technological innovation is the main factor in competitive success, this familiarity is essential.

Systems Awareness and Thinking: When organizations changed slowly there was plenty of time to think carefully about and adapt to the systems implications of gradual technological change. Digital transformation requires everyone to have greater knowledge of the integrational aspects of organization processes. This can be achieved through training that creates awareness of organization wide processes. Given that these processes will be changing more often in the future, this training will need to be updated and repeated regularly.

Technical Skills: Participation in process improvement that applies and exploits technology, for example making changes based on the results of AI analysis of corporate data, requires that employees have more technical understanding of their own work processes and the ability to contribute to modifications.

Collaboration and Innovation Skills: Collaboration, bringing together knowledge and skills from across the organization, is needed to effectively implement new technologies and maximise their contribution to performance.

Corporate and Market Knowledge: In most organizations successful digital transformation requires increasing empowerment at all levels. For example, acting on the increased knowledge of areas for improvement that is provided by the internet of things and AI cannot rely on all actions being approved at a senior level. Empowerment can only work if employees have a good understanding of organizational priorities (its strategy) and the context in which they are being pursued (the market).

These categories are a useful framework for considering your own training activity. Their application will depend on your own conditions. Many training options exist today that make corporate training easier. Sending people to off-site classes and losing their valuable contribution to today’s production is needed much less often than in the past. On-site and/or online options now enable high-quality training to be far less disruptive to production, and many companies are taking advantage of them. AR and VR will play an increasing role in the future.

How to plan data collection, storage and visualization in an IIoT deployment https://www.engineering.com/how-to-plan-data-collection-storage-and-visualization-in-an-iiot-deployment/ Mon, 21 Oct 2024 19:32:23 +0000 https://www.engineering.com/?p=133070 Be sure to consider scalability and future-proofing to accommodate evolving manufacturing processes and technologies.

The post How to plan data collection, storage and visualization in an IIoT deployment appeared first on Engineering.com.

When it comes to an IIoT (Industrial Internet of Things) implementation in manufacturing, data collection, storage, analytics and visualization are the core backplane that drives actionable insights and enables smarter operations.

How do these components typically align in an IIoT system, and what considerations should a manufacturing engineer keep in mind when planning an implementation? It can certainly get complicated, but breaking things down into their smaller parts makes it more manageable.

Data Collection

The effectiveness of data collection largely depends on sensor architecture. Depending on the equipment or process, various types of sensors (temperature, pressure, vibration, etc.) need to be deployed across critical points in the manufacturing process. Ensure sensors are selected with appropriate accuracy, environmental tolerance and response time for the specific application.

A data acquisition system (DAS) acts as an interface between these sensors and the IIoT platform. It gathers real-time data from sensors and transmits it to the edge or cloud infrastructure. The big decision here is whether to use edge processing (local data pre-processing) or rely on centralized data gathering at the cloud level. Edge processing offers lower latency, making it ideal for real-time tasks. It also reduces bandwidth needs by processing data locally. However, it requires more upfront investment in hardware and can be harder to scale. In contrast, cloud processing handles large data volumes more easily and scales better, though it comes with higher latency and ongoing costs for bandwidth and storage. Cloud systems also need robust security measures for data transmission. A hybrid approach combining both edge and cloud processing might be an option that balances real-time processing with scalable, centralized data management, but it depends on each application and the desired outcomes.

The next big decision is determining the optimal sampling rate. Too high a sampling frequency can overwhelm your storage and bandwidth, while too low a frequency may miss critical insights, particularly in dynamic manufacturing processes. Work with process engineers to set the data sampling frequency based on process variability. Test this often to ensure what you think is the optimal sampling rate isn’t leaving value uncaptured.
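As a rough starting point, the Nyquist criterion can anchor this discussion: sample faster than twice the highest-frequency process dynamic of interest, with a practical safety margin on top. A minimal sketch (the function name and the 10x default margin are illustrative assumptions, not a standard):

```python
def suggest_sampling_rate_hz(fastest_dynamic_hz: float, margin: float = 10.0) -> float:
    """Return a sampling rate comfortably above the Nyquist minimum.

    Nyquist requires more than 2x the highest frequency of interest; in
    practice a 5-10x margin is common so transients are not under-resolved.
    """
    if fastest_dynamic_hz <= 0:
        raise ValueError("fastest dynamic must be positive")
    return fastest_dynamic_hz * max(margin, 2.0)  # never go below 2x

# A vibration signature at 50 Hz suggests sampling at roughly 500 Hz.
rate = suggest_sampling_rate_hz(50.0)
```

The real margin should come from the process engineers' knowledge of how fast the process actually moves, which is exactly the collaboration the paragraph above recommends.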

If you are going to base major decisions on the insights gained through this IIoT system, you must ensure the integrity of collected data. This means that error checking (e.g., using checksums or hashing) and redundancy mechanisms (e.g., backup data paths or local buffering) are in place to handle network failures or sensor malfunctions.

A checksum is a small-sized piece of data derived from a larger set of data, typically used to verify the integrity of that data. It acts as a digital fingerprint, created by applying a mathematical algorithm to the original data. When the data is transmitted or stored, the checksum is recalculated at the destination and compared with the original checksum to ensure that the data has not been altered, corrupted or tampered with during transmission or storage.

Hashing is the process of converting input data into a fixed-size string of characters, typically a unique value (hash), using a mathematical algorithm. This hash is used for verifying data integrity, securing communication, and enabling fast data retrieval, with each unique input producing a unique hash.
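Both mechanisms are available in Python's standard library, which makes for a compact illustration. The sensor payload below is a made-up example string:

```python
import hashlib
import zlib

payload = b"temp=72.4;pressure=101.3;ts=1730000000"

# Checksum: CRC32 is cheap to compute and catches accidental corruption.
crc_at_source = zlib.crc32(payload)

# Hash: SHA-256 additionally resists deliberate tampering.
sha_at_source = hashlib.sha256(payload).hexdigest()

# At the destination, recompute both and compare with the originals.
received = payload  # pretend this crossed the network intact
crc_ok = zlib.crc32(received) == crc_at_source
sha_ok = hashlib.sha256(received).hexdigest() == sha_at_source

# A single flipped bit changes the checksum, so corruption is detected.
corrupted = bytearray(payload)
corrupted[0] ^= 0x01
tamper_detected = zlib.crc32(bytes(corrupted)) != crc_at_source
```

In a real deployment the checksum or hash travels alongside the payload; the snippet keeps everything in one process purely for illustration.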

When planning sensor deployment, focus on critical assets and key process variables that directly impact production efficiency, quality or safety. Implementing a hierarchical sensor strategy (high-priority sensors collecting frequent data, lower-priority ones providing long-term insights) can help balance costs and data richness.

Data Storage

Here again you are faced with a decision between local (edge) storage and a centralized cloud environment. The same pros and cons apply as in data acquisition, but your needs may be different.

Edge storage is useful for real-time, low-latency processing, especially in critical operations where immediate decision-making is necessary. It also reduces the amount of data that needs to be transmitted to the cloud.

Cloud storage is scalable and ideal for long-term storage, cross-site access and aggregation of data from multiple locations. However, the bandwidth required for real-time data streaming to the cloud can be costly, especially in large-scale manufacturing operations.

Manufacturing environments typically generate large volumes of data due to high-frequency sensors. Plan for data compression and aggregation techniques at the edge to minimize storage overhead.

Lossless compression reduces data size without any loss of information, ideal for critical data. Popular algorithms include GZIP, effective for text data; LZ4, which is fast and low-latency for real-time systems; and Zstandard (Zstd), offering high compression and quick decompression for IIoT.
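A quick lossless round-trip shows why repetitive sensor logs compress so well. This sketch uses Python's `zlib` (the DEFLATE algorithm underlying GZIP); the sensor string format is invented for the example:

```python
import zlib

# Repetitive sensor text: 1000 readings cycling through five values.
readings = ";".join(
    f"sensor_7,temp,{20.0 + (i % 5) * 0.1:.1f}" for i in range(1000)
).encode()

compressed = zlib.compress(readings, 9)  # level 9 = maximum compression
restored = zlib.decompress(compressed)

lossless = restored == readings          # every byte recovered exactly
ratio = len(compressed) / len(readings)  # far below 1.0 for repetitive data
```

The same round-trip property is what makes lossless compression safe for the "critical data" case: decompression returns exactly what was stored.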

Lossy compression, on the other hand, is suitable for sensor data where some precision loss is acceptable in exchange for better compression. Wavelet compression is efficient for time-series data, and JPEG/MJPEG is often used for images or video streams, reducing size while maintaining most visual information.

Data aggregation techniques help reduce data volume by combining or filtering information before transmission. Summarization involves averaging or finding min/max values over a time period. Sliding window aggregation and time bucketing group data into time intervals, reducing granularity. Event-driven aggregation sends data only when conditions are met, while threshold-based sampling and change-detection algorithms send data only when significant changes occur. Edge-based filtering and preprocessing ensure only relevant data is transmitted, and spatial and temporal aggregation combines data from multiple sources to reduce payload size.
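Two of the techniques above, time bucketing and threshold-based sampling, can be sketched in a few lines. The readings and the 0.5-unit threshold are illustrative assumptions:

```python
from statistics import mean

samples = [(0.2, 20.1), (0.7, 20.2), (1.1, 20.1), (1.6, 24.0), (2.3, 24.1)]  # (t_sec, value)

# Time bucketing: average all readings that fall in the same 1-second bucket.
buckets: dict[int, list[float]] = {}
for t, v in samples:
    buckets.setdefault(int(t), []).append(v)
summary = {b: round(mean(vs), 2) for b, vs in sorted(buckets.items())}

# Threshold-based sampling: transmit only when the value moves by >= 0.5 units.
sent, last = [], None
for t, v in samples:
    if last is None or abs(v - last) >= 0.5:
        sent.append((t, v))
        last = v
```

Five raw readings collapse to three bucket summaries, and only two points cross the change threshold, which is the bandwidth saving the paragraph describes.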

Because edge devices often operate in resource-constrained environments, deal with real-time data and must efficiently manage the communication between local systems and central servers, there are several edge-specific considerations for optimizing data management in IIoT systems. For real-time applications, techniques like streaming compression (e.g., LZ4) and windowed aggregation help minimize latency by processing data locally. Delta encoding reduces data size by only transmitting changes from previous values, minimizing redundancy. Additionally, hierarchical aggregation allows data to be aggregated at intermediate nodes, such as gateways, before being sent to the central system, further reducing the transmission load and improving overall efficiency in multi-layered edge networks. These considerations are uniquely suited to edge computing because edge devices need to be efficient, autonomous, and responsive without relying heavily on central systems or expensive bandwidth.
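Delta encoding in particular is simple enough to sketch end to end: the sender transmits only the change from the previous reading, and the receiver reconstructs the series by accumulation. The rounding step is an assumption to keep float drift out of the illustration:

```python
def delta_encode(values: list[float]) -> list[float]:
    """Emit the difference from the previous value (first delta is from 0)."""
    deltas, prev = [], 0.0
    for v in values:
        deltas.append(round(v - prev, 6))
        prev = v
    return deltas

def delta_decode(deltas: list[float]) -> list[float]:
    """Rebuild the original series by running accumulation."""
    values, acc = [], 0.0
    for d in deltas:
        acc = round(acc + d, 6)
        values.append(acc)
    return values

series = [100.0, 100.1, 100.1, 100.3, 99.9]
encoded = delta_encode(series)   # mostly small numbers near zero
restored = delta_decode(encoded)
```

Because slowly varying sensor values produce deltas near zero, the encoded stream compresses far better than the raw series, which is the redundancy reduction the paragraph refers to.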

You’ll also need a storage architecture that can scale to accommodate both current and future data growth. Also, implement a robust redundancy and backup strategy. With critical manufacturing data, losing information due to hardware failure or network issues can be costly. Redundant storage, preferably in different geographic locations (for disaster recovery), is crucial for resilience.

TIP: For time-sensitive data (e.g., real-time process control), store at the edge and use data batching for non-urgent data that can be transmitted to the cloud periodically, reducing latency and network costs.
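The tip above can be sketched as a tiny edge-side dispatcher: urgent readings go out immediately, non-urgent ones accumulate until a batch is full. The batch size, the `urgent` flag, and the in-memory `uploads` stand-in for a cloud endpoint are all assumptions for illustration:

```python
uploads = []     # stand-in for a cloud endpoint receiving network calls
buffer = []      # local staging area for non-urgent readings
BATCH_SIZE = 4

def handle(reading: dict) -> None:
    if reading.get("urgent"):
        uploads.append([reading])         # time-sensitive: send alone, right away
    else:
        buffer.append(reading)
        if len(buffer) >= BATCH_SIZE:
            uploads.append(list(buffer))  # one network call for many readings
            buffer.clear()

for i in range(6):
    handle({"id": i, "urgent": i == 2})
```

Six readings cost only two "network calls" here; a real implementation would also flush the buffer on a timer so quiet periods don't strand data at the edge.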

Analytics

Real-time analytics is essential for immediate decision-making (shutting down a faulty machine or adjusting a process parameter), while historical analytics provides long-term insights into trends and performance (predictive maintenance, yield optimization).

To enable real-time analytics, data should undergo initial pre-processing and filtering at the edge, so that only relevant insights or alerts are passed to the cloud or central system. This reduces data transfer overhead and minimizes latency in decision-making. For long-term analysis (identifying trends, root cause analysis), use batch processing techniques to handle large datasets over time. Machine learning (ML) and AI models are increasingly integrated into IIoT systems to identify anomalies, predict failures or optimize operations based on historical data.

IIoT analytics is more than just looking at individual sensor data; it’s about correlating data across multiple devices, sensors and even different factory lines to uncover patterns. Implement data fusion techniques where data from different sensors or sources can be combined to improve the accuracy and richness of insights.

Visualization

Visualization tools are essential for both operators and decision-makers to quickly assess the performance of processes and machines. These should include customizable dashboards that display real-time key performance indicators (KPIs) like throughput, efficiency, downtime and machine health. KPIs should be linked to the specific objectives of the manufacturing process.

For process optimization and long-term planning, historical trends and patterns should be visualized clearly. This allows for root-cause analysis, identifying inefficiencies and making data-driven decisions about process improvements.

These visualizations should be tailored to different user roles. Operators need real-time alerts and immediate insights into machine performance, while managers or engineers might need access to historical data and trend analysis. Design the user interface (UI) and access controls with these distinctions in mind.

For advanced implementations, digital twins and augmented reality can be used to simulate and visualize complex data in 3D. Digital twins create a virtual replica of the manufacturing environment, allowing engineers to monitor and optimize operations without needing to be physically present.

Planning IIoT implementations

When planning IIoT in manufacturing, focus on building a scalable, resilient and secure architecture for data collection, storage, analytics and visualization. Ensure that data collection is optimized to balance cost and data richness, using both edge and cloud storage appropriately. Analytics capabilities should provide real-time decision support while enabling deep insights through predictive maintenance and long-term performance analysis. Visualization tools should cater to different user needs, ensuring clear, actionable insights through both real-time dashboards and historical data views. Keep in mind the challenges of data volume, latency, network bandwidth and data integrity as you design the IIoT system, with attention to scalability and future-proofing the infrastructure to accommodate evolving manufacturing processes and technologies.

BAE’s Falconworks R&D division aims to transform aerospace engineering https://www.engineering.com/baes-falconworks-rd-division-aims-to-transform-aerospace-engineering/ Wed, 16 Oct 2024 20:37:49 +0000 https://www.engineering.com/?p=132945 Siemens and BAE Systems partner in a massive digitalization effort in its aerospace manufacturing and engineering operations.

The post BAE’s Falconworks R&D division aims to transform aerospace engineering appeared first on Engineering.com.

‘Factory of the Future’ technologies involve integrating advanced digital tools like IoT, AI, and automation to create efficient, flexible, and intelligent manufacturing processes. (Image: BAE Systems)

At the Farnborough Airshow in July 2024, Siemens and BAE Systems announced a five-year collaboration to accelerate digital innovation in engineering and manufacturing. Using Siemens’ Xcelerator platform, this partnership seeks to transform processes within BAE Systems’ Air sector through FalconWorks, its Research and Development (R&D) division. The R&D center fosters an open innovation ecosystem, connecting suppliers, SMEs, governments, research organizations, and academia to “accelerate the innovation of future air power through the development of technology and capabilities.” It unites approximately 2,000 experts across 11 sites in the UK.

This agreement builds on a longstanding relationship, deploying Siemens’ advanced digital software, such as NX and Teamcenter to enhance sustainability, industrial digitalization, and supply chain modernization. Leaders from both companies emphasized the collaboration’s potential to drive Industry 4.0 advancements and achieve significant digital transformation in aerospace manufacturing. Iain Minton, BAE Systems’ Technology Capability Delivery Director, noted, “Siemens understands the complexities of our operating environment, so we can very quickly mature an idea to the point where it is put into practice, for example when we are looking to implement and optimize new engineering, support, or manufacturing capabilities.”

A digital engineering ecosystem for open innovation

BAE Systems’ FalconWorks is not only looking at solving today’s challenges; “it is the agile innovation powerhouse driven by […] technology teams that will develop the game-changing technologies of the future.” Simply put, it focuses on scanning the technology horizon to identify and develop the groundbreaking building blocks of the future in the Aerospace and Defense sector. Maintaining an edge in such a competitive landscape means developing industry standards and working with regulators to ensure these are acceptable to society from a safety and sustainability perspective, while focusing on effective routes to market for successful commercialization.

Fostering an open innovation ecosystem, the company embarked on a multi-year strategic investment in Digital Engineering (DE) to digitalize its systems engineering and integration capabilities, “investing in digital infrastructure and virtual, collaborative Digital Engineering Capabilities Labs (DECL) to drive rapid innovation, state-of-the-art digital technologies, and cloud migration.” This includes collaboration with SMEs, academia, legislators, and industry leaders, along with co-funding start-ups to develop new technologies.

Per a 2020 whitepaper, BAE Systems elaborated on its Advanced Integrated Data Environment for Agile Manufacturing, Integration and Sustainment Excellence (ADAMS) reference architecture to fulfil this vision: “this digital enterprise is built on a model-based, integrated development or data environment that supports multi-disciplinary, multi-organization stakeholders and leverages product-line reference architectures and a shared model library to develop, deliver, and sustain a system through its lifecycle.” Clearly, the digital ecosystem is only an enabler, part of a data layer foundational to drive process and product innovation.

Aerospace digital twins and data management

PLM serves as the backbone, integrating technologies, data, and processes to ensure seamless information flow across business functions and the entire product lifecycle—from concept and design to manufacturing, maintenance, and recycling. PLM processes require connected data flows across the manufacturing and extended enterprise ecosystem. Through integration and workflow automation, all product data, from design to production, must be digitized and interconnected, facilitating seamless communication between systems, machines, and teams. Such integration allows for real-time monitoring, data-driven decision-making, and automation, ensuring that the factory operates efficiently and can quickly adapt to changes in demand or production requirements.

Additionally, PLM supports continuous improvement by enabling feedback loops from the factory floor back to design and engineering, leading to optimized processes and product quality. For instance, this includes the implementation of advanced manufacturing techniques, such as additive manufacturing, 3D printing, and automated assembly, connecting CAD and software data with production processes by ensuring that all design and manufacturing data are centrally managed and accessible. In the context of BAE’s vision, PLM can facilitate the integration of Digital Twins, virtual representations to allow real-time monitoring and optimization of manufacturing processes—ensuring that the factory can respond dynamically to changes and demands. Aerospace Digital Twins are crucial for driving Industry 4.0 by enhancing efficiency, reducing costs, driving quality adherence, compliance, and sustainability. The top five Digital Twins essential for this purpose include:

  1. Product Digital Twins: Represent physical aircraft or components throughout their lifecycle, enabling real-time monitoring, predictive maintenance, and performance optimization to reduce downtime and extend asset lifespan.
  2. Process Digital Twins: Model and optimize manufacturing and assembly processes, allowing for quick identification of inefficiencies, waste reduction, and overall production quality improvement.
  3. Supply Chain Digital Twins: Provide a real-time, end-to-end view of the supply chain, managing disruptions, optimizing logistics, and ensuring timely delivery of components.
  4. Operational Digital Twins: Monitor in-service aircraft and systems, enabling optimization of flight paths, fuel consumption, and maintenance schedules for better performance and reduced costs.
  5. Human Digital Twins: Simulate interactions between humans and machines, optimizing human factors, enhancing training, and improving safety by modeling human responses to various scenarios.

Connected, sustainable asset optimization

A connected intelligent factory is a data-driven manufacturing environment that uses advanced automation, real-time analytics, and interconnected systems to optimize aerospace component production, assembly, and maintenance. The Aerospace industry strives to balance cutting-edge innovations that foster competitive advantage with through-life optimization of complex assets to effectively capitalize on long-lifecycle products. Asset compliance traceability and through-life monitoring are essential for Aerospace and Defense and other heavily regulated operations, supporting new business models—from product development to full in-service operations management.

To that effect, BAE Systems’ Digital Intelligence division acquired Eurostep in 2023 to accelerate the development of its digital asset management suite, PropheSEA™, a platform to “consolidate and share […] complex asset data securely, allowing assets to be managed proactively, reducing operating costs and maximizing asset availability.” Mattias Johansson, Eurostep CEO, highlighted that “Eurostep has collaborated with BAE Systems for many years with […] ShareAspace sitting at the heart of Digital Intelligence’s Digital Asset Management product suite [to help organizations] securely collaborate across the supply chain and cost effectively manage their assets through life.” Regulators also require through-life carbon footprint measurement, which can be difficult to forecast with products whose asset life can span 40 to 50 years.

As presented at one of the ACE conferences championed by Aras in 2016, Kally Hagstrom, then Manager of Information Systems with BAE Systems, explained why complex long-lifecycle products require a PLM strategy that enables a high level of resiliency. BAE Systems then initiated the implementation of Aras Innovator alongside its legacy Teamcenter platform to consolidate several PLM business capabilities, from requirements to change management, systems engineering, supplier collaboration, process planning and MBOM management, document and project management, as well as obsolescence management. Clearly, based on the recent Siemens partnership extension, the legacy Teamcenter environment remains in place at BAE Systems today, regaining ground in the maintenance, repair and overhaul (MRO) space and/or expanding further into downstream manufacturing digitalization. Furthermore, it would be interesting to hear whether and how BAE Systems is driving the coexistence of multiple PLM platforms in its DE ecosystem to support open innovation and manufacturing, possibly leveraging its 2023 investment in Eurostep.

To paint the full picture, it would be necessary to dig deeper into how BAE Systems collaborates with its supply chains and manages its intellectual property. This would also require a broader understanding of how the OEM connects the dots across its PLM, ERP and MES landscape to drive a truly end-to-end, data-connected digital landscape. By enabling sustainable design and efficient resource management, integrated PLM can help reduce the environmental impact of aerospace manufacturing. This aligns with BAE’s broader goals of innovation and sustainability, ensuring that BAE Systems’ Factory of the Future is both technologically advanced and environmentally responsible.

Optimize Part Procurement & Generate Savings https://www.engineering.com/resources/optimize-part-procurement-generate-savings/ Mon, 14 Oct 2024 18:06:55 +0000 https://www.engineering.com/?post_type=resources&p=133431 Turn your existing process into data-driven collaborative sourcing to decrease complexities and costs and minimize compromises between program margin and time-to-market.

The post Optimize Part Procurement & Generate Savings appeared first on Engineering.com.

Reorganizations, mergers, and innovation are opportunities to create new product parts, which increases the costs to design, manufacture, test, source, and store those parts.

In addition, as a result of global disruptions, manufacturers report that supplier costs are rising, deliveries are delayed and suppliers are less reliable/predictable. This can drain revenue and profits.

Enhancing collaboration between Engineering and Procurement can help companies address these challenges.

Watch our webinar to see how NETVIBES solutions powered by data science and AI generate part procurement savings.

  • Uncover the synergies between Engineering and Sourcing with the Dassault Systèmes 3DEXPERIENCE platform.
  • Learn how to streamline your new component sourcing by leveraging all of your data with 3D and artificial intelligence technologies.

 

This on-demand webinar is sponsored by Dassault Systèmes.

The continuous transformation equation https://www.engineering.com/the-continuous-transformation-equation/ Tue, 08 Oct 2024 17:38:27 +0000 https://www.engineering.com/?p=132625 When does a continuous improvement mindset kick in to avoid digital transformation fatigue?

The post The continuous transformation equation appeared first on Engineering.com.

Digital transformation is not just about implementing new technologies; it is a continuous journey that requires ongoing adaptation and evolution. This process reshapes business models, customer interactions, and operations, demanding that organizations stay flexible and proactive. To navigate this journey successfully, companies must find a balance between driving significant changes and allowing time for stabilization, adoption, and continuous improvement.

As organizations embark on their transformation journey, they face barriers such as technical debt, change fatigue, trade-offs between business and IT imperatives, and various challenges related to costly PLM and ERP implementations. Technical debt from outdated systems and processes can impede progress, while change fatigue can overwhelm teams, making it crucial to manage the pace of transformation effectively. Additionally, the high implementation costs and the integration and upgrade complexities of new enterprise platforms can strain resources, suppliers and partners.

By addressing these challenges, organizations can set achievable and tangible ambitions aligned with their long-term goals. Focusing on clear objectives and incremental improvements enables companies to modernize operations and foster a culture of continuous improvement, ultimately empowering them to innovate effectively and achieve sustainable growth.

The continuous nature of digital transformation

Digital transformation is an ongoing process—and perhaps even a “state of mind”—that continuously evolves how businesses operate and deliver value. It is not a one-time initiative but an infinite project that requires organizations to be in a state of perpetual readiness for change. This involves reshaping everything from business models to customer interactions. For example, moving from a traditional retail model to an e-commerce platform involves a complete overhaul of operations, including inventory management, logistics, and customer service. This constant state of evolution can be challenging, but it is necessary to stay competitive in a world where customer expectations and technologies are always shifting.

To succeed in this environment, organizations must accept that transformation does not have a single fixed endpoint, considering the following key questions:

  • Are we prepared for continuous adaptation? How well-equipped is our organization to respond to ongoing changes in technology, market conditions, and customer needs?
  • What does our desired future state look like? Have we defined a clear vision of where we want to be in the next 3-5 years, and how will this transformation help us get there?
  • How do we prioritize transformation efforts? Which areas of our business are most critical to transform first to achieve our long-term goals?
  • What capabilities do we need to develop or acquire? Do we have the necessary skills, technologies, and processes to support continuous transformation?
  • How do we measure the success of our transformation? What metrics and KPIs will help us track progress and ensure that our transformation efforts are delivering the expected value?

Reflecting on these questions can help organizations set a clear direction for their transformation journey and ensure that they are prepared to navigate the complexities and challenges that come with continuous evolution. It is a continuous journey where the ability to adapt to new technologies, changing market conditions, and evolving customer needs is essential. This ongoing evolution requires a mindset shift from seeing transformation as a project with a start and finish to viewing it as an integral part of the business strategy.

Balancing change with stabilization and adoption

While ongoing transformation is crucial for staying competitive and embracing technological advancements, it’s equally important to balance it with periods of stabilization. Continuous change without the opportunity to solidify and adopt new processes can lead to chaos and overwhelm teams. Successful organizations alternate between phases of significant change and stabilization to ensure that new processes are fully integrated, and employees are not stretched too thin.

During periods of change, the focus is on implementing major initiatives such as adopting new technology platforms, restructuring business processes, or launching new products. These are high-impact changes that often disrupt the status quo. However, after these changes are implemented, it is essential to allow time for stabilization. This phase involves integrating the new processes into daily operations, training employees, and refining workflows. By allowing time for adoption, organizations ensure that the changes deliver their intended benefits and do not negatively impact performance.

Finding this balance is key to maintaining momentum without overwhelming the organization. A clear, flexible roadmap that outlines when to drive change and when to stabilize can help manage expectations and ensure that teams are not overburdened. This approach allows organizations to make impactful changes while ensuring these changes are fully adopted and optimized before moving on to the next big initiative.

Integrating continuous improvement with transformation

Continuous improvement is about making small, incremental enhancements to refine and optimize existing processes. It complements digital transformation by focusing on optimizing what is already in place. When integrated into the broader transformation strategy, continuous improvement helps ensure that organizations are not only making big changes but also continuously evolving and improving.

During stabilization phases, continuous improvement can focus on fine-tuning new processes and systems, ensuring they deliver maximum value and efficiency. For example, after implementing a new technology platform, continuous improvement efforts can identify and resolve any issues, streamline workflows, and enhance user training. This helps solidify transformational changes and prepares the organization for the next phase of evolution.

Organizations can also leverage continuous improvement to prepare for future transformation. By encouraging a culture where employees are constantly looking for ways to improve, businesses can identify areas ripe for transformation and make smaller changes that pave the way for larger initiatives. This creates a virtuous cycle where transformation and continuous improvement feed into each other, driving sustained innovation and growth. Successfully navigating the continuous transformation journey requires strategic planning and thoughtful prioritization. Organizations should consider the following key questions to effectively balance change and stabilization while driving long-term success:

  • What are our long-term strategic goals? Understanding the end goals helps prioritize transformation initiatives and identify areas where continuous improvement can drive the most value.
  • How do we measure the impact of transformation and improvement efforts? Establishing clear metrics and KPIs for both transformation and continuous improvement is essential for tracking progress and aligning efforts with business objectives.
  • How will we support our teams during periods of change? Effective change management, including clear communication, training, and ongoing support, is crucial for minimizing resistance and ensuring successful adoption.
  • What is the right balance between change and stabilization for our organization? Assessing the organization’s capacity for change helps determine how much transformation can be absorbed at once without overwhelming teams and disrupting operations.
  • How do we maintain momentum without overwhelming the organization? Setting realistic timelines and milestones for transformation and improvement initiatives can help keep the organization focused and energized without leading to burnout.

Building resilience and adaptability: embracing continuous transformation

The ability to continuously transform while maintaining operational stability is a key factor in long-term success. Organizations that embrace change as a constant and foster a culture of adaptability are more resilient and better equipped to handle disruptions and seize new opportunities. Preparing for the unexpected means having systems, processes, and a mindset in place that allow for quick adaptation to new challenges.

A culture of adaptability and responsiveness to change enables learning organizations to see transformation as an opportunity rather than a threat. When employees are empowered to contribute to continuous improvement and are supported through periods of change, they are more likely to embrace new ways of working. This resilience is essential for navigating the complexities of the transformation journey and achieving sustainable success.

Digital transformation is not a destination but an ongoing journey that requires constant adaptation and improvement. By balancing periods of change with stabilization and integrating continuous improvement into the transformation strategy, organizations can navigate this journey more effectively. Embracing such cyclic evolution and fostering a culture of adaptability enables businesses to not only manage disruption but also thrive amid continuous change. Organizations that approach digital transformation as a continuous journey—rather than a series of disconnected projects—position themselves for sustained innovation, growth, and long-term success.

How to fight technological inertia to make key improvements https://www.engineering.com/how-to-fight-technological-inertia-to-make-key-improvements/ Tue, 08 Oct 2024 14:33:16 +0000 https://www.engineering.com/?p=132601 This CMM use case demonstrates how overcoming this inertia is essential to staying competitive.

The post How to fight technological inertia to make key improvements appeared first on Engineering.com.


Using a portable measurement arm (Image: Frontier Metrology)

Continuous improvement is the name of the game in manufacturing, paring away inefficiencies and making projects more profitable–something management gets very excited about. However, while some improvements can be made with small investments, such as kanban cards or kaizen bins, other process improvements require more significant outlays: bringing in new technology can cost hundreds of thousands to millions of dollars. And the initial outlay for a piece of equipment is just the beginning; training, upskilling and shop space may be associated costs. When it comes time to open the checkbook, some manufacturing decision-makers start singing a different tune: why do we need these improvements again? What’s wrong with the old process?

When it’s time to change, manufacturers may face technological inertia: the phenomenon by which accumulated knowledge of and experience with one technology works against the effective adoption of a new one. One technology often held back this way is the portable coordinate measuring machine, or CMM arm, which can drastically reduce inspection times and drive immediate production quality improvements.

So how can manufacturing managers successfully navigate an opportunity to invest in new technology to improve a process, without getting stuck in the inertia of the known?

What is technological inertia?

One example of technological inertia might take place at a busy doctor’s office considering a transition from x-ray to CT scans: it’s more complicated than just buying the machine, because:

  • Technicians know how to take x-rays and may require new training
  • Patients are already familiar with what’s required for an x-ray
  • Doctors are more experienced in interpreting x-rays than CT scans
  • The business can’t afford to see a ‘dip’ in quality of care during an adjustment period

So, instead of investing in the CT scanner, the business may purchase more x-ray machines, to keep up with increasing demand without having to adopt the new generation of technology. In this way, the inertia of the old technology inhibits any improvements the new technology could bring.

Overcoming this inertia to realize the benefits of a new technology is essential for companies to stay competitive and drive profits.

To learn more about how manufacturing decision-makers can overcome this inertia and find their way to the other side, engineering.com spoke to two engineering and metrology professionals with experience not only with portable CMM arms, but also with the technology and processes they replace: manual measurement techniques such as plate layout, verniers, and gauge blocks, as well as the venerable bridge CMM in the climate-controlled quality lab.

Alex Dunn is a Manufacturing Engineer and Measurement Specialist with ten years’ experience in the gas turbine manufacturing industry. He has watched portable measurement arms develop from early models limited in accuracy to today’s models, which may rival the accuracy of the bridge CMM.

Fabrizio Beninati is the owner of Frontier Metrology, a metrology service provider and FARO distributor based in Ontario, Canada. Working with FARO and Polyworks, Fabrizio has seen the full gamut of metrology workflows, from the creative to the archaic, as he demos and promotes the capability of the portable FARO arm equipped with a laser line scanner.

Quality and Metrology in Manufacturing

Ever since the industrial revolution brought the innovation of interchangeable parts, measurement and quality control have been essential to manufacturing. Tolerances limiting the variation of part dimensions are set by engineers according to the requirements of the part, and quality control processes ensure that production matches those dimensions within the set tolerances. The instruments that enable quality control range from the humble tape measure on a construction site to the ISO 10360-certified, gold-tipped Zeiss CMM with 0.3 µm accuracy, and everything in between.

A few common metrology solutions in manufacturing include:

  • Manual instruments such as micrometers, depth gauges and verniers
  • Granite surface plates used in conjunction with tools such as height gauges, gauge blocks, and dial indicators
  • Optical devices such as comparators, shadowgraphs, and profile projectors
  • A typical 3-axis bridge or gantry CMM, which can be programmed to take highly accurate measurements using probe contact
  • Portable CMMs, which digitally read the joint positions of an arm to interpret the 3D position of the measurement device, such as a probe or digital optical system
  • Digital optical systems, such as 3D scanners
  • CAD software, an essential part of digital metrology solutions, as collected measurement data can be compared to the CAD model reference to determine and report on deviations
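The CAD-comparison workflow described above boils down to simple geometry: probe points on the part, compare each to its nominal position in the CAD model, and report deviations against tolerance. The sketch below is purely illustrative (no vendor API, just the math); the function name, points and tolerance are invented for the example.

```python
import math

def deviation_report(measured, nominal, tolerance):
    """Compare measured XYZ points against nominal CAD positions.

    measured, nominal: lists of (x, y, z) tuples in the same order.
    tolerance: allowed deviation, in the same units as the points.
    Returns a list of (index, deviation, in_tolerance) entries.
    """
    report = []
    for i, (m, n) in enumerate(zip(measured, nominal)):
        dev = math.dist(m, n)  # Euclidean distance (Python 3.8+)
        report.append((i, dev, dev <= tolerance))
    return report

# Example: two probed points vs. their CAD nominals, 0.05 mm tolerance
nominal = [(0.0, 0.0, 0.0), (10.0, 0.0, 0.0)]
measured = [(0.01, 0.02, 0.0), (10.09, 0.0, 0.0)]
for idx, dev, ok in deviation_report(measured, nominal, 0.05):
    print(f"point {idx}: deviation {dev:.3f} mm -> {'PASS' if ok else 'FAIL'}")
```

Real metrology packages add alignment (fitting the measured cloud to the CAD frame of reference) before this comparison step, but the pass/fail logic is the same idea.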

For manufacturers used to one of these processes, inertial factors such as personnel training, the cost of new equipment, and limited awareness of the benefits and ROI of alternatives make it difficult to implement new solutions, such as a portable CMM.

Advantages and applications of Portable CMM

In the typical machine shop or fabrication shop, the headache of metrology is that manual tools such as verniers are fast but not sufficiently accurate or repeatable, while the bridge CMM is highly accurate–even surpassing many projects’ tolerance requirements–but too slow, especially for new projects, when it needs to be programmed. In Alex Dunn’s experience as a Measurement Specialist, while many of the arm CMMs on the market today can’t match the accuracy of a bridge CMM, they can typically hold tolerances of 1 thou (0.001 in.) or looser. For applications with more relaxed tolerances, such as fabrication, the speed and flexibility of portable CMMs is unmatched. “You grab the arm, calibrate the probe once, you don’t have to worry about calibrating different angles, different styli. You just calibrate the probe and it’s good for the whole volume of measurement. You’re not concerned with crashing the machine. You’re not concerned with having to bring parts into the temperature-controlled lab. You get to go to your parts,” explained Dunn.

“The big power of a bridge CMM is automation. If you have a high volume of parts that need to be measured, you program the machine and you can train an operator to run the machine, get the data, and do what they need to do with the parts.” While this automation makes for a faster process at high volumes, it also comes with costs.

First, the CMM needs to be programmed, not unlike a CNC mill, to make the movements required to bring the probe to touch off at each measurement location. Second, a CMM is a very expensive piece of equipment: if the quill or the probe crashes into the part, you’ve not only scrapped a part, but the CMM must now be calibrated or repaired. “If you don’t have high volume, the portable solution is much faster because you’re not programming a machine,” said Dunn. “You’re just simply operating the instrument physically, so you’re not going to be concerned with your clearance planes, or your stylus calibration. You can obviously damage your part if you hit it with the instrument, but the risk of collision is much less. You can very rapidly get the measurement you need and move the part on. So you gain a lot of speed in that regard.”

Dunn recommends the Hexagon Romer arm. “If you pair that with the [Hexagon] AS1 laser line scanner, which I was fortunate enough to have at a previous employer, you can tackle a lot of projects with that because you have the ability to do touch probing as well as scanning for large surfaces.”

This pain point of slow cycle time is why many customers call Fabrizio Beninati at Frontier Metrology to learn about arm CMM technology. For Beninati, understanding the advantages and applications of portable CMM systems is key to not only selling them as a distributor, but also using them himself as a metrology service provider.

Beninati finds that many manufacturers in the automotive industry are moving some inspection tasks to portable CMM solutions, finding the conventional CMM too slow and cumbersome. “A lot of our clients are leaving the CMM behind, or dedicating the CMM to the high-precision work and switching to the arm to do a higher volume of parts.” With the arm and laser line scanner in conjunction with Polyworks, said Beninati, “customers are able to quickly scan and target what they need and generate a report in a fraction of the time.” Beninati recommends a FARO arm in conjunction with a laser line scanner and Polyworks software to quickly and easily capture measurement data and use frame-of-reference inspection to detect and measure deviation.

A part in Polyworks, showing the scanned data mesh (real part dimensions) compared to the CAD reference, with deviation callouts. (Image: Frontier Metrology)

“CMM is slow, it’s methodical, it’s delicate work,” said Beninati. “You don’t rush CMM because probes are expensive, the machine’s expensive, everything is very methodical in the CMM world. The arm is more freehanded and forgiving to capture data.”

Beninati echoed Dunn in the idea that while portable systems may not be able to measure as accurately as bridge CMMs today, they still find applications in precision manufacturing. “We have customers that use a bridge CMM for their initial process capability studies, and they say once we’re within our values now we’ll just inspect using the FARO arm because you can do tenfold more vs the CMM.” Both Dunn and Beninati anticipate the accuracy rating of portable systems creeping up in the future, to rival that of larger machines.

How to Overcome Technological Inertia and Improve Manufacturing Processes

According to Beninati, technological inertia is a major reason why manufacturers drag their feet or fail to upgrade to faster, more efficient processes, such as a portable CMM. When an aerospace manufacturing shop hired Beninati as a service provider to do inspection using the FARO arm, he pitched a portable system for the customer to buy so that they could implement it in-house. “They told me they use shadowgraphs. They measure using a shadow and grid cells, counting them out manually,” he explained. “It works, and they say, ‘Why improve it? Why get an arm?’ They can just do it by counting.”

Stay up to date

The leaders who successfully navigate a technology change are those who empower themselves and their teams with knowledge about emerging technologies. Without that understanding, it’s impossible to see how a new technology might apply to your processes. Beninati highlighted trade shows and supplier demos as key tools to help manufacturing leaders stay in the know.

“Find those couple trade shows that are on the cutting edge,” said Beninati. “That’s the vendors’ time to shine. Go to trade shows where the big players go, and see the new and emerging technologies. Stay open to possibilities.”

Dunn also highlighted demos as a key knowledge tool. “I always advise folks, if you’re interested in any of these instruments contact the suppliers because it’s their job to demo the equipment and bring application engineers to you,” he said. “You can put the product right in front of them and say, ‘show me how to measure it,’ And it’s their job to prove to you that they can. Then everyone wins.”

Upskill employees

Part of technological inertia is cultural. It’s natural for individuals, especially in their jobs, to resist and fear change and the instability it may bring. When a new technology reduces labor hours required for a process, workers may wonder if they will lose those hours. Communication is key to confronting this mindset and assuaging these fears. Managers can address these fears by highlighting the benefits of upskilling for employees.

“I’ve seen cases where workers want the arm because they see it as a new skill set, a new and emerging technology,” said Beninati. “But I’ve also seen quality teams who have seen it as a threat to their jobs.” In his experience, however, customers that do get started with a portable CMM typically find applications for both the new instrument and their existing CMM, creating more opportunities for metrology personnel than before.

Plan funding strategically

If there is no budget, the benefits of a new technology don’t matter. Beninati has seen this firsthand as a supplier. When a customer assembling parts sourced from many suppliers began seeing quality variations, they implemented a manual metrology process using height gauges and blocks to measure each assembly by hand. “They were taking up to 3 hours for each assembly,” recalled Beninati. “I went in there, did the demo, scanned it in like 3 minutes, and even had their guy–who had never touched a FARO arm–try it. It took him 8 minutes, so I said, ‘How many can I put you down for?’ And we all laughed, but today it’s waiting on upper management.” Even though Beninati and the FARO arm demonstrated crystal-clear ROI, cost stood in the way.
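Back-of-the-envelope arithmetic shows why a demo like that makes such a strong ROI case. The sketch below uses the 3-hour manual time and 8-minute arm time from the anecdote; the labor rate and system cost are assumed, illustrative figures, not numbers quoted by Beninati.

```python
# Rough ROI illustration using the times from the anecdote above:
# a manual inspection takes 3 hours; with the arm it took 8 minutes.
manual_hours = 3.0
arm_hours = 8 / 60
hours_saved_per_assembly = manual_hours - arm_hours

labor_rate = 60.0        # assumed fully loaded $/hour (illustrative)
system_cost = 40_000.0   # assumed portable arm cost (illustrative)

savings_per_assembly = hours_saved_per_assembly * labor_rate
assemblies_to_payback = system_cost / savings_per_assembly
print(f"Saves {hours_saved_per_assembly:.2f} h per assembly "
      f"(${savings_per_assembly:.0f}); payback after "
      f"~{assemblies_to_payback:.0f} assemblies")
```

At any realistic production volume, a payback measured in a few hundred assemblies is exactly the kind of number that gets a purchase past upper management.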

“If you’re an OEM supplier or plant and you secure that big contract, you’ve already put in your budget the building expansion, the new tools, new equipment. So in for a penny, in for a pound. You didn’t put in that FARO arm because you’re going to do it with verniers and calipers, you’re committed until the next job comes around.”

In Dunn’s experience, that’s the best time to propose technological change: when new money comes in along with a new contract.

“Usually when your processes are locked in place and you’re used to a certain amount of revenue from your parts, it’s really, really hard to go to management and say, ‘we need you to eat into those profits so that I can have this piece of equipment.’ They’re going to say, ‘Why? I’m making this money, and you’re telling me I’ll make less money to deliver the same product?’ That’s a really hard thing to sell,” he explained. When budgeting new money for different project needs, it’s easier for employees to propose new technology using the funds earmarked for the existing process. “Let’s say $40,000 is all it takes to get a standard probing package arm,” said Dunn. “You can buy $40,000 worth of gauges no problem. So at that point, you can approach management and say, hey, here’s a solution that will work for this project, the money is set aside for it, and we can then use the piece of equipment to improve other processes in future.”

Listen to employee experience

Lastly, in addition to suppliers and trade shows, your employees hold knowledge of new technologies and alternative ways of working that you can unlock. As experienced machinists, engineers and workers move around the industry, they carry knowledge of how the industry is moving forward, and they bring this knowledge to your company.

“Who is more likely to be aware of and plugged into these technologies? It’s gonna be your shop floor guys,” said Dunn. “A lot of employees arrive at your shop with experience with other equipment, and bring that knowledge and advice. That opens doors and minds to new alternatives. Management can’t spread themselves so thin as to become as intimately familiar with the process as a machinist or fabricator. So, good managers listen to their employees.”

Where will your next process improvement take you?

While technological inertia has sunk many ships across industries and sectors, the keys to navigating it come down to basic, effective management practices:

  • Stay up to date in your industry
  • Leverage the experience of your employees
  • Understand the ROI
  • Don’t fear change

Armed with these principles, you’ll be ready to ride the next technological wave and stay ahead of the competition.

What are the roles of sensors and actuators in IIoT? https://www.engineering.com/what-are-the-roles-of-sensors-and-actuators-in-iiot/ Mon, 07 Oct 2024 19:48:25 +0000 https://www.engineering.com/?p=132533 Sensors are the eyes and ears of your operation and actuators are the hands.

The post What are the roles of sensors and actuators in IIoT? appeared first on Engineering.com.

Every manufacturing engineer considering an IIoT implementation should focus on how these systems contribute to data collection, real-time decision-making and automated control within the production environment.

Sensors are the eyes and ears of your operation. These data collection devices continuously monitor various physical or environmental parameters on the shop floor. Sensors have been developed to measure almost any condition on the shop floor. Here are some common types:

  • Temperature (for controlling furnaces or ovens)
  • Pressure (for monitoring hydraulic or pneumatic systems)
  • Vibration (for detecting imbalance in motors or machinery)
  • Humidity (for ensuring optimal conditions in certain manufacturing processes)
  • Proximity (for part detection on a conveyor belt or pallet)
  • Torque and force (for ensuring precise assembly or machining)

These days, most sensors provide real-time data that are essential for understanding the status of machines, the health of equipment and the quality of products.

Sensors can capture data continuously or at regular intervals, feeding it back to a centralized system or edge devices. This data allows you to monitor machine performance and production quality in real-time. By continuously monitoring conditions such as temperature, vibration and pressure, sensors can help predict equipment failures before they happen—enabling predictive maintenance strategies. This minimizes downtime and unplanned repairs. Sensors can also ensure product quality by tracking parameters such as size, weight or chemical composition, ensuring products are within acceptable tolerances.

The data collected by sensors is sent to centralized cloud systems or edge devices for real-time analysis, enabling manufacturers to make informed decisions on production adjustments and process improvements.
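As a rough illustration of how continuous sensor data can feed a predictive-maintenance decision, the sketch below flags a machine when a rolling average of vibration readings crosses a warning threshold. The readings, window size and threshold are invented for the example, not standards-based values.

```python
from statistics import mean

WARN_THRESHOLD = 4.5   # assumed alert level in mm/s RMS (illustrative)
WINDOW = 5             # readings per rolling window

def check_vibration(readings):
    """Flag a machine for maintenance when the rolling average
    of recent vibration readings exceeds the warning threshold.
    Returns (index of last reading in window, rolling average) pairs."""
    alerts = []
    for i in range(WINDOW, len(readings) + 1):
        window_avg = mean(readings[i - WINDOW:i])
        if window_avg > WARN_THRESHOLD:
            alerts.append((i - 1, round(window_avg, 2)))
    return alerts

# Hypothetical vibration trend: steady, then climbing toward failure
readings = [2.1, 2.3, 2.2, 2.4, 2.6, 3.9, 4.8, 5.2, 5.5, 5.9]
for idx, avg in check_vibration(readings):
    print(f"reading #{idx}: rolling avg {avg} mm/s exceeds threshold")
```

Averaging over a window rather than alerting on single readings is a simple way to ignore one-off spikes while still catching a sustained upward trend.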

Actuators: The Hands of Your IIoT System

Once sensors collect and transmit data, actuators play the critical role of executing actions based on the data received. Actuators are devices that respond to control signals by performing physical tasks, including:

  • Opening or closing a valve (to control fluid or gas flow in a pipeline)
  • Adjusting motor speeds (for conveyor belts or robotic arms)
  • Turning machines on or off (for automated start/stop of equipment)
  • Controlling temperature (by activating heating or cooling systems)
  • Moving robotic arms or equipment (for assembly, material handling or other precision tasks)

In an IIoT system, actuators are responsible for automating responses to specific conditions detected by sensors. This creates the foundation for closed-loop control systems that can operate independently of human intervention. For example, if a temperature sensor detects overheating, the actuator could activate a cooling system without manual intervention. This automation reduces human labor and the chances of errors or inefficiencies in production. It also speeds up response times to deviations, minimizing waste and downtime.

Actuators can also adjust machine settings dynamically. For example, based on real-time data, they can modify the speed or pressure of a machine, ensuring the production process adapts to the changing needs of the workflow.

In more advanced IIoT setups, edge computing and AI-driven algorithms use sensor data to make autonomous decisions, triggering actuators without human oversight. This could be as simple as adjusting a process or as complex as rerouting products based on real-time data streams.

Working together in IIoT

In a typical IIoT system, the interaction between sensors and actuators follows a continuous cycle of data collection and response, which is often referred to as closed-loop control. Here’s an example:

  • Sensors detect changes: A temperature sensor detects that the temperature in a furnace is rising above the set threshold.
  • Data is sent: The sensor transmits this information to the controller (either an edge device or cloud platform) in real-time.
  • Data is analyzed: The controller analyzes the data and determines that corrective action is needed (e.g., the furnace is overheating).
  • Actuator takes action: Based on the analysis, the controller sends a signal to an actuator that opens a valve to release cooling air or turns on a cooling system.
  • Process adjustment: The actuator performs the task, and the sensor continues to monitor the process, feeding back data to ensure the temperature returns to safe levels.
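The furnace cycle above can be condensed into a minimal control loop. This sketch is purely illustrative: the setpoint, threshold, and the sensor-read and valve functions are hypothetical stand-ins for a real hardware interface, passed in as arguments so the logic can run on its own.

```python
SETPOINT = 850.0      # target furnace temperature, degrees C (illustrative)
THRESHOLD = 870.0     # temperature at which cooling must engage (illustrative)

def control_step(read_temp, open_cooling_valve, close_cooling_valve):
    """One pass of the sense -> analyze -> actuate cycle.

    read_temp, open_cooling_valve and close_cooling_valve are stand-ins
    for real hardware interfaces, injected so the loop can be tested."""
    temp = read_temp()                # steps 1-2: sensor detects, data is sent
    if temp > THRESHOLD:              # step 3: controller analyzes
        open_cooling_valve()          # step 4: actuator takes action
        return "cooling"
    if temp <= SETPOINT:
        close_cooling_valve()         # step 5: process returns to normal
        return "idle"
    return "monitoring"              # between setpoint and threshold: watch

# Simulated run: an overheating reading followed by recovery
log = []
print(control_step(lambda: 882.0, lambda: log.append("open"),
                   lambda: log.append("close")))
print(control_step(lambda: 845.0, lambda: log.append("open"),
                   lambda: log.append("close")))
```

In a real deployment this step would run on a PLC or edge device on a fixed scan cycle; the closed-loop structure is the same regardless of where it executes.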

Benefits of sensors and actuators in manufacturing

  • Increased production efficiency: Sensors and actuators enable real-time adjustments to processes, ensuring that machines operate within optimal parameters. This minimizes downtime and keeps production flowing smoothly.
  • Enhanced predictive maintenance: Continuous data from sensors allows for early detection of wear and tear or impending failures, reducing the need for reactive maintenance and minimizing unexpected breakdowns. Actuators can automatically adjust processes to prevent equipment damage.
  • Improved quality control: Sensors track key quality metrics, and actuators can adjust the process instantly to ensure product quality remains consistent, reducing waste and scrap.
  • Operational flexibility: Sensors and actuators provide greater control over manufacturing systems, enabling them to respond flexibly to changes in production schedules, environmental factors, or even supply chain disruptions.
  • Cost reduction: Automation through sensors and actuators can lower labor costs and reduce human error. Moreover, optimized processes lead to less material waste, contributing to overall cost savings.
  • Data-driven decision making: By integrating sensors and actuators with a central data system (cloud or edge-based), manufacturers can leverage real-time analytics to gain actionable insights and make informed decisions to improve efficiency and productivity.

Common challenges

Let’s face it: maintaining a network of sensors, actuators and similar technology in a manufacturing environment can be tricky. Many environmental and workflow factors can degrade their performance, even when the devices aren’t integrated into a broader IIoT implementation.

However, in IIoT manufacturing systems, several challenges are directly related to the integration of sensors and actuators into the broader industrial network. One key issue is communication latency and bandwidth limitations. IIoT systems rely heavily on real-time data transfer between sensors, actuators and control systems. Latency or insufficient bandwidth can delay data transmission or actuator responses, which is particularly troublesome in time-sensitive applications where quick reactions are essential.

Another challenge is connectivity and reliability issues. Since IIoT systems often involve wireless communication (e.g., Wi-Fi, LPWAN, or other IoT protocols), connectivity problems like signal dropouts, weak coverage or protocol incompatibility can disrupt the flow of critical data. In a networked environment, these disruptions can lead to missed sensor readings or commands not reaching actuators, causing downtime or unsafe conditions.

The sheer volume of data generated by IIoT devices can also lead to data overload and management challenges. With sensors constantly transmitting data, storage and processing systems can quickly become overwhelmed, making it difficult to extract actionable insights or react quickly to system needs. This can hinder operational efficiency, slow decision-making, and complicate data analysis.
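One common mitigation for this flood of data is report-by-exception: a device transmits a reading only when it differs from the last transmitted value by more than a deadband. A minimal sketch (the deadband value and sample data are arbitrary):

```python
def report_by_exception(readings, deadband=0.5):
    """Return only the readings that moved beyond the deadband."""
    sent = []
    last = None
    for r in readings:
        if last is None or abs(r - last) > deadband:
            sent.append(r)   # worth transmitting upstream
            last = r
    return sent

raw = [20.0, 20.1, 20.2, 20.9, 21.0, 21.1, 22.0, 22.1, 22.0]
sent = report_by_exception(raw)
print(f"{len(raw)} readings -> {len(sent)} transmitted: {sent}")
```

For slowly changing signals this can cut transmitted volume dramatically at the cost of losing small fluctuations, so the deadband must be chosen per signal.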

Security vulnerabilities are another significant concern in IIoT systems. As sensors and actuators become more interconnected, they are exposed to potential cyber threats. Hackers could access the network to manipulate sensor data or control actuators, posing serious risks to both data integrity and physical safety.

Lastly, sensor and actuator compatibility can be an issue when integrating devices from different manufacturers or upgrading legacy systems. IIoT environments require seamless communication between different components, and incompatible sensors, actuators or communication protocols can lead to integration problems, system inefficiencies or even failures in real-time operations.

To address these challenges, best practices include using real-time networking protocols, implementing strong cybersecurity measures, employing edge computing to process data closer to the source, and ensuring that systems are compatible and interoperable across the IIoT network. These steps help ensure that the IIoT infrastructure operates reliably and efficiently.


The post What are the roles of sensors and actuators in IIoT? appeared first on Engineering.com.

What are the connectivity considerations in an IIoT implementation? https://www.engineering.com/what-are-the-connectivity-considerations-in-an-iiot-implementation/ Fri, 04 Oct 2024 15:18:53 +0000 https://www.engineering.com/?p=132475


Connectivity is the foundation of any Industrial Internet of Things (IIoT) implementation. For engineers, it’s not just about ensuring that devices and systems can talk to each other; it’s about choosing the right network architecture, protocols and security strategies to meet operational goals.

In IIoT, connectivity refers to the ability of machines, sensors and control systems to communicate over networks. This enables real-time data exchange and interaction between devices, local networks, edge systems and centralized cloud platforms. In IIoT implementations, this connectivity is critical to enabling the flow of data needed for process optimization, predictive maintenance, remote monitoring and real-time decision-making.

IIoT devices can range from sensors to actuators to industrial machines. For devices to exchange data directly, you’ll typically use machine-to-machine (M2M) protocols. Engineers must ensure that these devices can communicate over low-latency and robust protocols that handle the real-time data flows characteristic of industrial environments.

Protocols like Modbus, OPC UA and MQTT are industry standards for device-to-device communication in IIoT, but there are many others to choose from depending on the application, environment and system requirements. Each protocol comes with its own set of strengths and weaknesses, so it’s important to assess performance, security, scalability and interoperability when selecting one for your IIoT architecture.

Another consideration is protocol overhead, which is the extra information that communication protocols add to manage data transmission, handle security, ensure data integrity and support real-time operation. While necessary for reliable, secure communication, overhead can reduce bandwidth efficiency, increase latency and consume more power, which is especially problematic in IIoT environments. Understanding and managing protocol overhead is essential for optimizing performance and efficiency in IIoT implementations.
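The effect of payload framing on overhead is easy to see by comparing a verbose JSON reading with a tightly packed binary equivalent. The field layout below is invented for illustration; real protocols add their own framing on top of the payload.

```python
import json
import struct

# The same sensor reading, encoded two ways.
reading = {"sensor_id": 1042, "timestamp": 1730000000, "temp_c": 23.75}

json_payload = json.dumps(reading).encode("utf-8")

# Packed binary: unsigned 16-bit id, unsigned 32-bit timestamp, 32-bit float.
binary_payload = struct.pack("<HIf", reading["sensor_id"],
                             reading["timestamp"], reading["temp_c"])

print(f"JSON:   {len(json_payload)} bytes")
print(f"binary: {len(binary_payload)} bytes")  # 2 + 4 + 4 = 10 bytes
```

Multiplied across thousands of devices reporting many times per second, that per-message difference is exactly the bandwidth and power cost the paragraph above warns about.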

Edge connectivity

Edge devices (often called edge gateways or edge controllers) act as intermediaries between the industrial devices and the cloud. They handle preprocessing and data aggregation before sending relevant information upstream.

Implementing edge computing reduces latency, conserves bandwidth and allows for real-time decision-making at the device level. Edge architecture must be scalable and secure, often integrating with local databases or edge AI algorithms to run complex analytics.
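The preprocessing and aggregation role of an edge gateway can be sketched as collapsing a window of raw samples into one summary record before sending it upstream (the sampling rate and values are invented):

```python
def summarize(window):
    """Collapse a window of raw readings into one upstream summary record."""
    return {
        "count": len(window),
        "min": min(window),
        "max": max(window),
        "mean": sum(window) / len(window),
    }

# One second of 10 Hz samples reduced to a single record for the cloud.
samples = [20.1, 20.3, 20.2, 20.4, 20.6, 20.5, 20.7, 20.8, 20.6, 20.8]
record = summarize(samples)
print(record)
```

Ten samples become one record, a tenfold reduction in upstream messages while preserving the statistics most analytics actually need.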

Cloud connectivity and platform integration

IIoT relies heavily on cloud-based platforms for long-term data storage, aggregation, advanced analytics and remote monitoring. Cloud platforms offer scalable environments for handling data streams from devices in the field.

Ensuring reliable connectivity between edge nodes and the cloud is vital. Engineers should also focus on data integrity and network reliability, optimizing data protocols to reduce packet loss and latency.

Common protocols and data handling

MQTT is lightweight, supports real-time data and works well in low-bandwidth environments, making it ideal for IIoT where data volumes can be massive but not all data needs to be sent in real-time.

OPC UA is widely used in industrial settings for real-time data exchange between PLCs and other industrial automation equipment. It also supports security, which is a critical concern in industrial systems.

RESTful APIs or HTTP/HTTPS are more suitable for web-based interfaces or when integrating IIoT with existing enterprise IT systems but may not offer the real-time capabilities needed for certain mission-critical operations.

How to address connectivity challenges

Industrial environments can be challenging for connectivity due to electromagnetic interference, harsh environments and network congestion. Implement redundant networks (dual Ethernet, cellular backup) for failover in case of primary network failures. Mesh networking in IIoT can increase reliability in environments with intermittent connectivity.

Engineers will often deal with scaling from dozens to thousands of devices over a large geographical area. To support this, it’s important to architect networks that can grow without compromising performance. This may involve local edge computing to handle localized data aggregation and minimize bandwidth requirements.

Security is paramount in IIoT, especially when sensitive operational data and critical infrastructure are involved. Use end-to-end encryption (TLS, AES) and secure communication protocols (like OPC UA with security features enabled). Additionally, ensuring device authentication, role-based access control and network segmentation can help protect against cyber threats.

Zero-trust architectures are becoming increasingly popular in industrial networks to ensure that no device or user is implicitly trusted.
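Device authentication at the message level can be illustrated with an HMAC: sender and receiver share a secret key, and the receiver rejects any payload whose tag does not verify. This is a sketch of the idea using Python's standard library, not a complete security design; real deployments also need key provisioning, rotation and replay protection.

```python
import hmac
import hashlib

SHARED_KEY = b"device-42-secret"   # in practice, provisioned securely per device

def sign(payload: bytes) -> bytes:
    """Compute an authentication tag over the payload."""
    return hmac.new(SHARED_KEY, payload, hashlib.sha256).digest()

def verify(payload: bytes, tag: bytes) -> bool:
    # compare_digest avoids timing side channels in the comparison
    return hmac.compare_digest(sign(payload), tag)

msg = b'{"valve": "open"}'
tag = sign(msg)

print("authentic command accepted:", verify(msg, tag))
print("tampered command rejected:", not verify(b'{"valve": "shut"}', tag))
```

A tampered actuator command fails verification even though the attacker can read the payload, which addresses the sensor-data manipulation risk described above.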

Latency and bandwidth optimization

Low latency is crucial for time-sensitive operations, such as real-time control or automated responses in manufacturing. 5G is being explored for IIoT because it offers low latency and high bandwidth, while LPWAN technologies (Low Power Wide Area Networks, such as LoRaWAN) trade bandwidth for long-range, low-power communication.

You should also look at how data is being transmitted. Use data compression, aggregation and edge processing to reduce the volume of data being sent over the network.
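Batching and compressing readings before transmission is straightforward with standard tools. Sensor batches compress well because their structure repeats; the data below is synthetic and the ratio will vary with real payloads.

```python
import json
import zlib

# A batch of similar readings compresses well because of repeated structure.
batch = [{"sensor_id": 7, "seq": i, "temp_c": 21.0 + (i % 3) * 0.1}
         for i in range(100)]

raw = json.dumps(batch).encode("utf-8")
compressed = zlib.compress(raw, level=6)

print(f"raw: {len(raw)} bytes, compressed: {len(compressed)} bytes")
```

The trade-off is CPU time and added latency at the edge device, so batching windows should be sized against the application's real-time requirements.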

Technologies enhancing IIoT connectivity

With the advent of 5G, IIoT is gaining a huge advantage in terms of bandwidth and low latency. 5G allows for high-density device support and real-time communication, ideal for applications like autonomous vehicles, smart grids and advanced robotics in factories.

For environments where power efficiency is crucial and devices are spread across large areas, such as farms, pipelines or smart cities, LPWAN protocols offer extended range and low power consumption with relatively low bandwidth needs.

Edge computing reduces the need to send every bit of data to the cloud, providing a more efficient means of processing high volumes of data locally. This can include real-time anomaly detection or local decision-making that reduces latency and bandwidth needs.

Best practices for IIoT implementation

In industrial settings, systems and machines from multiple manufacturers may need to communicate with each other. Ensure your connectivity infrastructure allows for interoperability through open standards (like OPC UA) and modular architectures that can easily integrate with third-party equipment.

Track all data flows and network performance with network monitoring tools and data governance frameworks. This will help in troubleshooting, performance tuning and meeting compliance standards.

Architect your IIoT system in a modular way so new devices or protocols can be integrated without requiring a full system redesign. This modularity supports future-proofing the system as new technologies emerge.

For engineers implementing IIoT, connectivity is a multi-faceted challenge that involves choosing the right protocols, designing reliable and secure networks, optimizing for scalability and latency and ensuring devices can communicate efficiently across systems. The foundation for a successful IIoT implementation lies in robust, scalable and secure connectivity, enabling real-time data flow, remote monitoring and proactive decision-making.

The post What are the connectivity considerations in an IIoT implementation? appeared first on Engineering.com.

Processing at the edge takes off https://www.engineering.com/processing-at-the-edge-takes-off/ Tue, 01 Oct 2024 19:57:50 +0000 https://www.engineering.com/?p=132351

Your data lives at the edge, and determining how to connect processes can improve manufacturing and monitoring.

The Nvidia IGX Orin platform (left) is used in healthcare, industrial inspection and robotics (from top to bottom, on right). Source: Nvidia

Real-time and near real-time processing at the edge is more common than ever, thanks to improvements in chips and batteries. Yet a variety of logistical and technical problems present challenges for companies engaging in such processing. Fortunately, every deployment is also an opportunity for these businesses to learn from their own experience and from one another.

Implementing industry 4.0 practices in real-time and near real-time processing at the edge requires evaluating how current procedures can be improved. Beneficial changes enable companies to handle numerous scenarios that relate to interconnected procedures. For example, ensuring there is adequate security at the edge is best accomplished as a team goal between business partners. This goal can utilize two or more tools, such as encryption and two-factor authentication.

Recent changes that have increased the amount of real-time and near real-time processing at the edge include: a current capability of up to 20 trillion operations per second (TOPS) for standard semiconductors, as opposed to a single TOPS a few years ago; faster speeds and lower power consumption across networks, from Long Range Wide Area Network (LoRaWAN) to 5G; and better software, including more artificial intelligence (AI) models as well as new data sets and tools.

“The edge is where the data come from. Bringing the processing to companies working in these spaces is the goal. Such action can bring deployment time down by as much as a third, like from 18 months to six months. That presents cost savings and better opportunities to leverage AI,” says Pete Bernard, Executive Director of tinyML Foundation.

tinyML is a Seattle-based nonprofit that focuses on low-power AI at the edge of the cloud. Its members include large corporations such as Qualcomm and Sony, academic institutions like Johns Hopkins University and nongovernmental organizations.

“tinyML holds frequent events to build community around the concept of the edge. We educate people about the potential of working at it. Our programs include contests, conferences, hackathons and workshops. One of the concepts we are considering now is data provenance,” says Bernard.

This idea relates to the watermarking of data sets and models. AI models must be trained on data sets. Stamping provenance helps users identify sources of data and the developers behind them. Such work makes it easier to integrate different data sets and models.
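A basic building block of such provenance stamping is recording a cryptographic digest of the exact bytes of a dataset, so anyone can later confirm that a model was trained on the same data. This is a minimal sketch of that one idea; real provenance schemes also record lineage metadata, signatures and model identifiers.

```python
import hashlib
import json

def dataset_fingerprint(records):
    """Hash a canonical serialization of the dataset, order included."""
    canonical = json.dumps(records, sort_keys=True).encode("utf-8")
    return hashlib.sha256(canonical).hexdigest()

# Synthetic training records, purely for illustration.
data = [{"x": 1.0, "label": "ok"}, {"x": 2.5, "label": "fault"}]
stamp = dataset_fingerprint(data)
print(f"provenance stamp: {stamp}")

# Any modification changes the stamp, flagging a different training set.
data[1]["x"] = 2.6
print("modified data matches stamp:", dataset_fingerprint(data) == stamp)
```

Because the digest covers every byte of the canonical serialization, even a one-value change in the training set produces a different stamp.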

Software for the edge

Simplifying edge operations is easier to accomplish with software designed for that purpose, like Dell’s NativeEdge platform. 

Dell’s NativeEdge platform helps enterprises work with data generated at the edge. Source: Dell

“With NativeEdge, a client can build an AI model to operate at the edge. They can retrain the model onsite at the edge. This saves money and gives them the ability to scale up the solution as needed,” says Pierluca Chiodelli, Vice President, Edge Engineering and Product Management at Dell Technologies.

Dell sees security as the biggest challenge for clients.

A company that tries to do everything itself runs the risk of exposing information. Any entity that generates data must protect the data at the points where the data is created and stored.

Dell is enhancing security by working closely with NVIDIA, which developed the AI Enterprise software integrated with NativeEdge’s engine.

“Inference at the edge, which involves gathering data with AI techniques, is really important. Everybody needs to have a way to deploy and secure that. Also a company has to maintain its AI stack, the tools and services to use AI correctly. It must have a blueprint to update all the pieces of the puzzle,” says Chiodelli.

As the different components of an AI stack can change, a company must be aware of all of them and how they interact. This helps the company make the necessary adjustments in proportion and on the appropriate timeline. Such work prevents deviations in manufactured products and slowdowns in production time. It also minimizes the time needed to retrain AI models and workers.

The market for the edge is growing

Nvidia is working on numerous hardware and software applications to meet the needs of companies utilizing edge computing. The company sees this market as expanding. A March 2024 forecast from International Data Corp. projected worldwide spending on edge computing to reach $232 billion this year.

One of Nvidia’s platforms for the edge is the Nvidia IGX Orin with NVIDIA Holoscan, which is designed for real-time AI computing in industrial and medical environments. This platform provides high performance hardware and enterprise AI software. The platform is for companies working in robotics, healthcare, scientific research, video analytics and broadcasting.

In scientific computing, the Nvidia IGX Orin with Holoscan platform has the power to stream high-bandwidth sensor data to the GPU. It can use AI to detect anomalies, drive sensor autonomy and lower the time to scientific insights. In the medical space, Magic Leap has already integrated Holoscan in its extended reality (XR) software stack to enhance the capabilities of customers. This has allowed one of its clients in software development to provide real-time support for minimally invasive treatments of stroke.

It’s difficult to establish interoperability across systems, says Chen Su, Senior Technical Product Marketing Manager of Edge AI and Robotics for Nvidia.

 “Today there are numerous installed legacy systems that weren’t originally designed with AI capabilities in mind. Integrating AI into those systems and still achieving real-time performance continues to pose a significant challenge. This can be overcome by developing industry-wide standards that can meet the complex connectivity requirements across sensors, actuators, control systems and interfaces,” says Su.

Once the task above is accomplished, the entire edge AI system will have no bottleneck in communication. It can then act in a software-defined manner, making the system more flexible and easier to manage.

STMicroelectronics (ST), a global manufacturer and designer of semiconductors, meets the needs of companies that process data in real-time and near real-time with a variety of edge AI tools and products.

These include the STM32 and Stellar-E pure-software edge AI solutions for microcontrollers (MCUs); the upcoming STM32N6, a high-performance STM32 MCU with an ST proprietary Neural Processing Unit (NPU); and the STM32MP2 microprocessor series.

Danilo Pau, Technical Director in System Research and Applications at STMicroelectronics, says advances in embedded AI computing that enable processing at the edge require higher energy efficiency. The task is made possible by a mix of assets, including super-integrated NPU accelerators, In-Memory Computing (IMC) and ST's 18nm Fully Depleted Silicon On Insulator (FD-SOI) technologies. Such resources can be integrated close to standard MCU and microprocessor unit (MPU) cores for viable, high-volume, low-cost manufacturing.

“There is also the super-integration of heterogeneous technologies in a single package, achieved by the Intelligent Sensor Processing Unit (ISPU) and Machine-Learning Core (MLC) product families. In a tiny package, micro-electromechanical systems (MEMS) sensors, analog and digital technologies are stacked for large, low-cost sensor volumes. They operate at microwatt power consumption. This is a fundamental contribution that enables the coming trillion-sensor economy envisaged by many IoT experts,” says Pau.

Organizations like tinyML Foundation play an important role in the business community. Since 2018, tinyML has encouraged many companies to invest in generative AI at the edge (edgeGenAI).

Pau says there is a need for even greater energy efficiency and further super-integration of heterogeneous technologies, including NPUs, IMC, deep-submicron technologies and sensors.

“The vision is to design embedded systems that match the energy efficiency of the human brain,” says Pau.

He adds that companies will increasingly need education about edge AI technologies and tools, and mastery of the related skills.

That fact explains why ST, which is currently Europe’s largest designer and manufacturer of custom semiconductors, is an active part of the tinyML community.

“ST works with many actors in the edgeGenAI ecosystem. We’re eager to see this ecosystem expand and serve AI developers in the best and most productive way. That will ease their path in bringing innovation to the edge AI market,” says Pau.

The post Processing at the edge takes off appeared first on Engineering.com.

Digital transformation is the starting point for the manufacturing marathon https://www.engineering.com/digital-transformation-is-the-starting-point-for-the-manufacturing-marathon/ Thu, 12 Sep 2024 20:39:39 +0000 https://www.engineering.com/?p=131806

Dassault Systèmes’ Mike Buchli on why successful manufacturers need to think holistically about digital.


This video is brought to you by Dassault Systèmes.

To achieve success in manufacturing today, professionals must synchronize the application of capital, labor and equipment in a sort of industrial ballet, where the competing requirements of time, cost and speed are calculated and recalculated on a real-time basis. To get great outcomes, manufacturing processes must be planned carefully, and to stay successful, manufacturers must aggregate, analyze and act on multiple data sets generated by manufacturing processes and by the products themselves.

Knowing what to do with that data, and how to analyze it to generate actionable insights, requires tools as complex as the production equipment itself. Simulation to model what-if scenarios is now standard, artificial intelligence is expected to make rapid inroads in manufacturing, and today everyone is talking about the digital twin. But that twin is not the end goal; it’s actually the starting place.

Jim Anderton discusses where manufacturers need to go once they leave that starting line with Mike Buchli,  3DEXPERIENCE WORKS manufacturing expert at Dassault Systèmes.  

* * * 

Learn more about how small and medium sized businesses can start benefiting from robotics automation that large enterprises have long enjoyed.

The post Digital transformation is the starting point for the manufacturing marathon appeared first on Engineering.com.
