Digital Transformation - Engineering.com
https://www.engineering.com/category/technology/digital-transformation/

Using AI at the edge to connect the dots in IIoT
https://www.engineering.com/using-ai-at-the-edge-to-connect-the-dots-in-iiot/ (Fri, 08 Nov 2024)
Edge AI expert Jack Ferrari shares his insights on why this tech is a good fit for manufacturing operations.

For manufacturing facilities, an IoT strategy for equipment data collection and analysis is an essential step toward digital transformation, providing the data required for data-driven insights, predictive maintenance and other benefits. However, connecting machines and equipment to the internet raises challenges for IT teams, including security, data storage, bandwidth and computing power.

To tackle these challenges, many teams weigh whether to process data at the edge or in the cloud. Historically, the edge has offered advantages such as faster response times, lower bandwidth requirements and better data security, while cloud solutions offer unmatched computing power. To run complex workloads such as AI models, the cloud may seem like the only option. However, vendors like MathWorks are proving that AI at the edge can provide the best of both worlds.

Engineering.com recently spoke with Jack Ferrari, Edge AI Product Manager at MathWorks, to learn more about Edge AI and how manufacturers use it.

Engineering.com (Eng.com): What are the benefits of edge devices compared to ‘dumb’ sensors that just send data straight to a PC or to the cloud?

Jack Ferrari (JF): Running AI models locally on edge devices instead of the cloud brings several benefits. First, the inference time (or response time) of the model can be greatly reduced, as data no longer has to be shuffled back and forth over the Internet. Second, and for the same reason, edge AI enhances data privacy (all data fed to and from the model stays on the device) and makes applications more reliable (less prone to network outages). Third, edge AI can lower costs by reducing or eliminating cloud hosting and storage fees.

Eng.com: What trends and technologies drive edge AI’s adoption across industries?

JF: The large and rapidly growing number of IoT devices across industries (expected to reach 40 billion by 2030) is generating massive amounts of data at the edge, driving the need for local processing to handle data efficiently, reduce latency and lower cloud costs. Advancements in hardware (like AI accelerators) and software (like new model compression techniques) are working in tandem to enable the adoption of edge AI.

Eng.com: Do you think industry-wide adoption of edge technology is driven by devices becoming more cost effective, energy efficient and/or powerful, or are edge trends driven by other, more strategic factors?

JF: A combination of the two is influencing edge AI’s adoption: On the technology side, new hardware platforms are being designed with AI workloads in mind. Recent advancements in microcontrollers (MCUs), digital signal processors (DSPs) and AI accelerators (like Neural Processing Units, or NPUs) are enabling the deployment of models that were previously impossible to consider running at the edge. Besides simply having greater horsepower, these new chips are being optimized to execute AI workflows with greater energy efficiency. At the same time, the ecosystem of software tools used to compress AI models and program them onto edge devices is becoming more robust and user-friendly, making the technology more accessible. Strategically, edge AI is enabling companies to differentiate their products in new ways: for example, by adding real-time processing and decision-making capabilities, enhancing device security by handling all data processing locally and enabling the personalization of AI models through techniques like on-device learning.

Eng.com: In many industries, security and IP concerns hold back adoption of AI tools. Is this seen in manufacturing?

JF: Security and IP concerns can indeed impact AI adoption in manufacturing. However, processing sensitive data at the edge (close to where it originates), rather than transmitting it to the cloud, can reduce exposure to potential breaches, offering a way to address these concerns.

Eng.com: What benefits can engineers expect when using edge AI?

JF: There are four primary benefits to using edge AI:

  • Lower latency: AI models can deliver predictions and classifications more quickly, which is crucial for engineers working on time-sensitive applications. This rapid response can enhance user experience and enable real-time decision-making, particularly in scenarios where milliseconds matter, such as autonomous vehicles or live data monitoring.
  • Lower costs: Reducing data transmission and storage fees, along with improved energy efficiency, leads to significant cost savings. For engineers, this means more budget can be allocated to other critical projects or resources and they can ensure their systems have higher uptime and availability, even during network outages, thus maintaining service continuity.
  • Enhanced privacy: By processing incoming data on-device instead of transmitting it to the cloud, engineers can ensure higher levels of data privacy and security. This is particularly beneficial in industries where sensitive information is handled, as it reduces the risk of data breaches and ensures compliance with privacy regulations, making it easier to protect user data.
  • Improved reliability: As edge AI does not rely on continuous cloud connectivity, it can continue to function during network outages. This ensures that critical operations, like monitoring and control systems in manufacturing, remain active even if the cloud connection is lost.

Eng.com: What are common challenges associated with edge AI?

JF: While it’s becoming easier to implement AI models on the edge, organizations should be mindful of several challenges that accompany the technology:

  • Resource constraints: Edge devices typically have limited processing power, memory and storage. Complex AI models may run slowly or not at all. To mitigate this, proper care should be taken in selecting model architectures that are well-suited for the edge device they will eventually be deployed to. Additionally, models can be further optimized for edge deployment with compression techniques like projection, pruning and quantization (see the sketch after this list).
  • Model deployment: Translating AI models from the high-level languages where they are defined and trained (like Python or MATLAB) to low-level languages that can be compiled to run on edge devices (like C or C++) can be challenging. MathWorks tools facilitate this process by automating the conversion, ensuring efficient deployment on a diverse range of hardware. For example, Airbus used GPU Coder to deploy deep learning models, trained in MATLAB for defect detection, onto embedded GPUs. GPU Coder automatically translated their MATLAB code into the corresponding CUDA code, which could be compiled and run on their embedded system.
  • Model maintenance: After deploying AI models to the edge, organizations should have a plan for keeping them updated over time. This can take several forms:
  • Over-the-air (OTA) updates, where new model files and weights are sent to edge devices over a network connection.
  • On-device training (or, incremental learning), where models are updated and refined directly on the device using local data, allowing for personalization without the need to communicate with the cloud.
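
As a rough, framework-agnostic illustration of the pruning and quantization ideas mentioned in the first bullet above (a NumPy sketch, not MathWorks tooling; the weights are random stand-ins for one layer of a trained network):

import numpy as np

def prune_by_magnitude(weights, sparsity=0.5):
    """Zero out the smallest-magnitude weights (here, the smallest 50%)."""
    threshold = np.quantile(np.abs(weights), sparsity)
    return np.where(np.abs(weights) < threshold, 0.0, weights)

def quantize_int8(weights):
    """Symmetric post-training quantization: float32 -> int8 plus one scale factor."""
    scale = max(float(np.abs(weights).max()), 1e-8) / 127.0
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

w = np.random.randn(1000).astype(np.float32)   # stand-in for one layer's weights
pruned = prune_by_magnitude(w)
q, scale = quantize_int8(pruned)
restored = q.astype(np.float32) * scale         # what the edge device would compute with
print("bytes before:", w.nbytes, "after:", q.nbytes,
      "max reconstruction error:", float(np.abs(restored - pruned).max()))

Real deployments would typically fine-tune after pruning and use a toolchain's calibration data for quantization, but the storage and accuracy trade-off is the same one sketched here.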

Eng.com: Are there edge AI use cases that are applicable across multiple industries?

JF: Beyond classic examples like image classification, object detection and semantic segmentation, one interesting application of edge AI that MathWorks is seeing used across industries is virtual sensors (or software sensors). AI-based virtual sensors can be used to infer sensor data that might be difficult or expensive to measure directly, by analyzing data from other sensors in real time. One great example is estimating the state of charge of a battery. While difficult to measure directly, it can be inferred from other, more easily attainable values, like current, voltage and operating temperature. By using AI models trained on historical battery performance data, the virtual sensor can predict the state of charge more accurately and adapt to changes in battery health and usage patterns, providing real-time insights without the need for additional hardware. Virtual sensors are applicable to multiple industries, including automotive, aerospace, manufacturing and healthcare. As another example, Poclain Hydraulics used MATLAB to design and deploy a neural network-based virtual sensor for monitoring the temperature of motors used in power machinery.
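
To make the virtual-sensor idea concrete, here is a minimal sketch of the same pattern in Python with scikit-learn rather than MATLAB; the battery data and the relationship between the signals are synthetic stand-ins, not a real cell model:

import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 5000
current = rng.uniform(-50, 50, n)       # A (discharge negative, charge positive)
voltage = rng.uniform(3.0, 4.2, n)      # V
temp_c = rng.uniform(10, 45, n)         # degC
# Invented "ground truth" standing in for logged state-of-charge data
soc = np.clip((voltage - 3.0) / 1.2 - 0.002 * current + 0.001 * (temp_c - 25), 0, 1)

X = np.column_stack([current, voltage, temp_c])
model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X[:4000], soc[:4000])

predicted = model.predict(X[4000:])      # the "virtual sensor" output
print("mean absolute error:", float(np.abs(predicted - soc[4000:]).mean()))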

Eng.com: Do you think that the trend toward AI and cloud-based IoT systems makes custom-built systems into dinosaurs? In other words, would a manufacturer be ‘crazy’ to consider building a solution in-house?

JF: While AI and cloud-based IoT systems offer scalable and cost-effective solutions, the decision to build a system in-house depends on a manufacturer’s specific needs and capabilities. Companies with specialized requirements or strong internal expertise may benefit from custom solutions, while others might prefer the speed and lower upfront costs of cloud-based platforms. Ultimately, the choice hinges on factors like customization, security and time to market.

Eng.com: As the complexity of the devices we use to monitor and maintain our equipment increases, is there a growing need to monitor and maintain the edge and IT devices as well? How do we do that?

JF: Yes, as the complexity and number of AI-enabled edge devices increases, so does the need for monitoring and maintenance. Over time, the input data to AI models can drift or differ significantly from the data they were originally trained on, negatively impacting model accuracy and performance. Organizations should anticipate this and consider approaches to continuously update their models, whether through OTA updates or incremental learning.
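
As one lightweight way to detect the input drift described above (an illustrative choice, not a MathWorks recommendation), a two-sample Kolmogorov-Smirnov test can compare recent sensor values against the training-time distribution:

import numpy as np
from scipy.stats import ks_2samp

def has_drifted(training_sample, recent_sample, alpha=0.01):
    """Flag drift when the two samples are unlikely to share a distribution."""
    _statistic, p_value = ks_2samp(training_sample, recent_sample)
    return p_value < alpha

baseline = np.random.normal(0.0, 1.0, 10_000)   # feature values seen at training time
live = np.random.normal(0.4, 1.0, 1_000)        # recent values from the edge device
if has_drifted(baseline, live):
    print("Input drift detected: schedule an OTA model update or on-device retraining.")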

For more on edge AI, check out https://www.engineering.com/edge-ai-solutions-every-engineer-should-know-about.

How should I design my IIoT architecture?
https://www.engineering.com/how-should-i-design-my-iiot-architecture/ (Tue, 05 Nov 2024)
The goal for companies starting out should be flexibility, interoperability and incremental investment.

At the heart of an effective IIoT system is a modular architecture. This approach allows manufacturers to implement plug-and-play devices, which can be added or removed as necessary. For example, if a facility wishes to introduce a new type of sensor to monitor machine performance, it can do so without overhauling the entire system. This flexibility enables incremental upgrades that enhance capabilities progressively.

In addition, adopting a microservices architecture means that each component of the IIoT system operates independently. If a particular service—such as data collection or processing—needs improvement, it can be scaled or replaced without affecting the entire infrastructure. This targeted enhancement ensures that the system evolves alongside operational needs, fostering innovation and responsiveness.

Flexible data management

As data volumes increase, flexible data management becomes essential. Leveraging cloud solutions allows manufacturers to tap into virtually limitless data storage and processing capabilities, accommodating the influx of information from a growing number of IIoT devices. This scalability ensures that data can be collected and analyzed efficiently, supporting informed decision-making.

Moreover, integrating edge computing allows for local data processing. By analyzing data closer to where it is generated, manufacturers can reduce latency and bandwidth demands, resulting in quicker response times and more efficient analytics. This setup is particularly beneficial for real-time applications, where immediate insights can drive operational improvements.

Interoperability

To maximize the benefits of IIoT, interoperability is crucial. By adopting standard communication protocols like MQTT or OPC UA, new devices integrate seamlessly with existing systems. This standardization reduces compatibility issues and simplifies the addition of new technologies.
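
As a minimal illustration of publishing a reading over MQTT with the widely used paho-mqtt client (assuming paho-mqtt 2.x; the broker address, topic and payload fields are placeholders):

import json
import time
import paho.mqtt.client as mqtt

client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)  # paho-mqtt 2.x; omit this argument on 1.x
client.connect("broker.example.local", 1883)            # placeholder broker; use TLS and auth in production
client.loop_start()

reading = {
    "machine_id": "press-07",                            # hypothetical asset identifier
    "timestamp": time.time(),
    "vibration_mm_s": 4.2,
}
client.publish("plant/line1/press-07/vibration", json.dumps(reading), qos=1)

client.loop_stop()
client.disconnect()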

Open APIs further facilitate integration by connecting diverse applications and devices. This approach not only enhances the system’s scalability but also promotes innovation by enabling third-party developers to contribute new functionalities.

Adaptive network infrastructure

A robust networking infrastructure is essential for supporting the growth of IIoT systems. Investing in scalable solutions, such as 5G or private LTE, ensures the network can handle a growing number of connected devices without sacrificing performance. These high-capacity networks facilitate rapid data transfer, which is critical for real-time operations.

Mesh networking can also enhance connectivity. As the number of IIoT devices increases, a mesh network improves reliability and coverage by letting devices communicate more effectively with each other and with central systems.

User-centric design

A focus on user-centric design is essential for an IIoT system to be accessible and useful. Developing intuitive interfaces enables users to interact with complex data and analytics. As new functionalities and devices are integrated, these interfaces should remain adaptable.

Customization options allow users to tailor their dashboards and data presentations. This flexibility ensures employees can concentrate on the metrics that matter most to their specific roles, enhancing productivity and engagement.

Incremental investment

By planning for phased implementation, organizations can gradually adopt IIoT technologies, assessing results and adjusting the strategy based on initial deployments. This method reduces the risk associated with large-scale changes and enables organizations to learn and adapt as they progress.

Starting with pilot programs provides an opportunity to test scalability in real-world conditions. These initial tests inform future investments and expansions, ensuring that the overall strategy aligns with operational goals.

Collaboration and ecosystem engagement

To keep pace with technological advancements, manufacturers must collaborate with their broader ecosystem. Partnering with technology providers and stakeholders ensures that the IIoT ecosystem evolves together, sharing insights and best practices.

Active community engagement in industry forums and engineering-focused websites such as Engineering.com helps manufacturers stay updated on emerging technologies and methodologies that facilitate scaling. By participating in these discussions, organizations can learn from the experiences of others and implement strategies that drive success.

Training and support

As systems evolve, training and support become critical. Providing continuous training for staff ensures that employees can effectively navigate new technologies and systems. This investment in human capital is essential for maximizing the benefits of IIoT.

Additionally, ensuring access to technical support helps organizations address challenges that arise during scaling. Support teams can assist with integration and troubleshooting, allowing manufacturers to focus on their core operations.

Feedback and iteration

Establishing feedback mechanisms is crucial for ongoing improvement. By collecting input from users and stakeholders, manufacturers can implement iterative enhancements to their IIoT systems as they scale. This feedback loop fosters a culture of continuous improvement and adaptation.

Encouraging adaptability within teams is also vital. By promoting a culture that embraces innovation and is open to new ideas, organizations can optimize their IIoT implementations over time, ensuring they remain responsive to evolving operational needs.

The breakdown

Here’s a simplified (but not simple) breakdown of the typical layers in this type of modular architecture (a short code sketch of how data moves through the lower layers follows the list):

1. Device Layer (Edge Layer)

Components: Sensors, actuators and other smart devices.

Function: Collects data from machinery and equipment and may perform local processing to filter or aggregate data before transmission.

2. Connectivity Layer

Components: Communication protocols and network infrastructure.

Function: Facilitates communication between devices and central systems using wired (e.g., Ethernet) or wireless technologies (e.g., Wi-Fi, Bluetooth, LoRaWAN, cellular).

3. Data Ingestion Layer

Components: Gateways and edge computing devices.

Function: Manages the transmission of data from edge devices to cloud or on-premises servers, handling data aggregation and initial processing.

4. Data Processing and Analytics Layer

Components: Cloud or on-premises servers equipped with data analytics and machine learning tools.

Function: Analyzes the ingested data for insights, predictive maintenance, and operational optimization, utilizing advanced algorithms and models.

5. Storage Layer

Components: Databases and data lakes.

Function: Stores historical data for analysis, reporting and compliance, supporting both structured and unstructured data types.

6. Application Layer

Components: User interfaces, dashboards and applications.

Function: Provides tools for visualization, reporting, and user interaction, enabling stakeholders to make informed decisions based on data insights.

7. Security Layer

Components: Security protocols, encryption and access controls.

Function: Ensures data integrity and confidentiality, protecting the system from cyber threats and unauthorized access at all layers.

8. Integration Layer

Components: APIs and middleware.

Function: Enables integration with existing enterprise systems (like ERP, MES, and SCADA) for seamless data flow and operational coherence.
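
As promised above, here is a deliberately toy sketch of how a single reading might move through the lower layers (device, ingestion, analytics, application); the names, threshold and message shape are invented for illustration:

from dataclasses import dataclass
from statistics import mean
import json
import time

@dataclass
class Reading:                                  # Device layer: what one sensor produces
    sensor_id: str
    timestamp: float
    value: float

def ingest(readings):                           # Data ingestion layer: gateway-side aggregation
    values = [r.value for r in readings]
    return {
        "sensor_id": readings[0].sensor_id,
        "window_start": readings[0].timestamp,
        "mean": mean(values),
        "max": max(values),
    }

def analyze(summary, limit=80.0):               # Processing and analytics layer: a simple rule
    return {**summary, "alert": summary["max"] > limit}

raw = [Reading("temp-01", time.time() + i, 70.0 + i) for i in range(15)]
print(json.dumps(analyze(ingest(raw))))         # Application layer would render this payload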

Wrap up

This layered modular architecture provides flexibility and scalability, allowing manufacturers to implement IIoT solutions tailored to their specific needs. By clearly defining each layer’s role, organizations can enhance interoperability, maintain security, and ensure that data flows effectively from devices to actionable insights. This structure facilitates incremental upgrades and the integration of new technologies as they become available.

Siemens’ Altair play: strategic AI move or simulation catch-up?
https://www.engineering.com/siemens-altair-play-strategic-ai-move-or-simulation-catch-up/ (Mon, 04 Nov 2024)
For Siemens, the challenge lies in more than simply acquiring AI—it’s about operationalizing it.

Siemens’ acquisition of Altair Engineering, a leader in artificial intelligence (AI), simulation and high-performance computing (HPC), reflects a bold ambition to strengthen its AI-driven industrial software portfolio. As Tony Hemmelgarn, President and CEO at Siemens Digital Industries Software, said: “This will augment our existing capabilities with industry-leading mechanical and electromagnetic capabilities and round out a full-suite, physics-based, simulation portfolio as part of Siemens Xcelerator.”

With a foundation already set in AI and generative AI capabilities, Siemens is taking a strategic leap to deepen its offerings in areas such as Product Lifecycle Management (PLM) and Digital Twins.

Yet, the acquisition raises critical questions: Is Siemens advancing its strategic edge by embedding next-level AI and knowledge graph technologies, or is it scrambling to keep up in a landscape that is moving faster than ever?

Elevating AI-driven PLM and digital twins

Siemens’ integration of Altair’s powerful AI, simulation and high-performance computing tools into its PLM tech suite, particularly within Teamcenter and Simcenter, offers a potential transformation in how digital twins and simulations are used across engineering and manufacturing. Altair’s deep expertise in physics-based simulations, including mechanical and electromagnetic modeling, could allow Siemens to develop more sophisticated digital twins that not only represent physical products but also predict behaviors and outcomes with high fidelity.

With Altair’s technology, Siemens can push digital twin capabilities beyond basic visualization and monitoring, creating a system that incorporates real-time data, predictive analytics and adaptive simulations. This would enable manufacturers to make informed, AI-driven decisions at every stage of the product lifecycle, from design and development to production and maintenance.

However, despite Siemens’ existing portfolio, which includes substantial AI and generative AI tools, the acquisition raises a critical question—how effectively can Siemens embed these capabilities as a core, transformative feature within its PLM platform? Without a clear path to seamlessly integrate AI across its offerings, Altair’s capabilities risk being relegated to auxiliary add-on features, potentially limiting their business impact. For Siemens, this move is more than just adding tools; it’s about embedding intelligence deeply within the end-to-end PLM framework, making AI a central component of its digital transformation strategy.

Enhancing digital twins with HPC

Siemens is marketing itself as a leader in digital twin technology, primarily through its Xcelerator platform, which integrates real-time operational data to improve asset management, production efficiency and product quality. Altair’s HPC capabilities could significantly enhance Siemens’ digital twin offerings by allowing more complex, detailed, and faster simulations—an essential component of predictive maintenance and optimization for manufacturers.

The integration of HPC into Siemens’ digital twin ecosystem could be transformative, enabling simulation models that accommodate an unprecedented scale of data and complexity. For instance, manufacturers could simulate entire production lines or supply chain networks, gaining insights that help them optimize operations, reduce energy consumption, minimize downtime and predict implications from product changes. This is particularly relevant as industries move toward more sustainable and resilient operations.

However, leveraging Altair’s HPC across Siemens’ existing infrastructure poses some challenges. HPC solutions typically require specialized infrastructure, substantial processing power and technical expertise. Siemens will need to carefully consider how to bring HPC capabilities into mainstream use within its portfolio, including positioning within its maturing SaaS offering. The risk here is that, without a robust integration plan, Altair’s HPC tools may remain isolated and less affordable, providing limited impact and reducing the transformative potential of this acquisition.

Knowledge graph technology: connecting data with digital thread

Altair’s recent acquisition of Cambridge Semantics, a developer of knowledge graph and data fabric technologies, brings new dimensions to the integration of enterprise data across complex manufacturing ecosystems.

Knowledge graphs provide a framework for Siemens to unify and contextualize vast amounts of data from disparate systems—an essential step for effective AI-driven insights and accurate digital twin models. With knowledge graphs, Siemens could break down data silos, connecting information from PLM, digital twins, and other systems into a cohesive whole, creating a seamless digital thread across the lifecycle.

Incorporating Cambridge Semantics’ knowledge graph technology into Siemens’ portfolio could lead to a new era of “data-rich” digital twins, where structured and unstructured data come together to provide a more comprehensive, actionable view of products, assets and operations. By grounding generative AI models in real-world data, knowledge graphs could improve response quality and deliver contextual insights, allowing engineers and operators to make better, faster decisions.

Yet, the question remains: can Siemens adapt this advanced data integration technology effectively in an industrial setting? Cambridge Semantics’ data fabric has been proven in sectors like defense, life sciences, and government. Adapting it for manufacturing will require Siemens to navigate industry-specific complexities. Without careful implementation, the risk is that knowledge graph technology will be underutilized—merely another tool rather than a strategic game-changer in Siemens’ PLM and digital twin offerings.

Strategic opportunity or catch-up?

The acquisition of Altair could empower Siemens to lead in AI-driven PLM, high-fidelity simulations and data-enriched digital twins. But the road ahead demands more than technological additions; it requires Siemens to deeply integrate these capabilities within its core platforms and ensure they serve as transformative, essential components rather than optional add-ons.

For Siemens, the challenge lies in more than simply acquiring AI—it’s about operationalizing it. By embedding Altair’s and Cambridge Semantics’ technologies as central pillars in its software ecosystem, Siemens has the opportunity to redefine industrial intelligence in manufacturing. Can Siemens realize this vision to become a true leader in AI-driven industrial software, or will it struggle to fully leverage these assets, ending up as a late entrant in a rapidly advancing field?

Focus on tech performance when training for digital transformation
https://www.engineering.com/focus-on-tech-performance-when-training-for-digital-transformation/ (Fri, 01 Nov 2024)
Training must be based on ensuring the performance objectives of digital transformation are achieved.

Digital transformation is the term used to describe the change that’s taking place in organizations in response to the Fourth Industrial Revolution. Until recently, organizations were built to reliably do tomorrow what they did today. They were not designed for our era of rapid technological change. For organizations to succeed in our new age, they need to innovate and respond to the rapidly changing environment they face. Many of the skills that have enabled them to be successful in the past remain important in repeatedly producing at a good price and quality, but there are a range of new skills that are needed.

In a recent survey conducted by Seattle-based IT services firm Veeam Software, 54% of organizations report that “lack of IT skills or transformation expertise” is a challenge with their digital transformation (source: https://go.veeam.com/wp-data-protection-trends-2024). They feel they don’t have the basic capabilities for technology-based change. While recruitment will often be helpful in addressing this, most organizations understand that training and retaining their own people is going to be important.

Discussions and articles on digital transformation skills today help us understand the skills development areas that organizations think are important. Often, the emphasis is on specialist technical skills, change management, creativity, adaptability and broad digital literacy.

These are based on the technical needs for implementing, operating and maintaining the new technology, managing the change with employees, developing creativity that will help exploit the technology and making everyone more comfortable with and able to use it in their jobs. These areas are all relevant, but they are not sufficient for effective digital transformation. Rather than only focusing on technology introduction, training needs to be based on ensuring the performance objectives of its introduction are achieved.

Focus on strategic goals

Digital transformation should be based on the organization’s strategic goals. Strategic analysis, including understanding of the market environment, the range of technologies and their uses and the internal capabilities in the organization, is needed to determine organizational priorities for digital transformation. Introduction of technology to achieve these priorities requires a systems approach, integrating understanding across the organization to make decisions that take into account their strategic implications and possibilities.

Implementation requires collaboration within and between existing organizational silos. As implementation proceeds, decisions with organization-wide implications may be made; for instance, on production lines, automation of one area may reduce or change work in another.

Once implementation is complete, exploitation of technologies requires more collaborative work: artificial intelligence may provide insights based on your data, but this will only turn into value if you have the capability to take action based on it. In many organizations, improvement activity is slow and contentious and requires a much more supportive environment. Even with this, frequent process modifications without careful collaboration will create chaos as changes are made without adequate understanding of their implications. Collaboration and teamworking skills, supported by organizational leadership fostering a supportive environment, are critical to digital transformation.

Within Lean and Agile operating models, there is often an understanding that training employees in a wider range of skills is needed to enable them to participate in innovation and continuous improvement. If each employee only knows a small part of the overall production process, they are less likely to have viable improvement ideas and be able to work with their colleagues to implement them.

In digital transformation this is even more important. Employees must have a better technical understanding, not just of new technologies, but of the processes they use themselves today and those in their general work area. Technical training on existing processes is a vital part of digital transformation.

Broader knowledge of the market environment and the organization’s overall strategy should inform all digital transformation activity, including technology focused innovation and continuous improvement. Regular, compulsory training to do this, supported by an effective communications process, is necessary to achieve an adequate level of understanding.

DX performance training categories

To help you consider the training needed for your digital transformation we have developed the following categories. Training for each work area should be considered in all of these, and plans developed to achieve the capability needed. Using the categories will enable you to reflect on the needs in each area and plan the training that each employee will receive. The categories are:

Tech Awareness: Awareness of the basic characteristics and uses of the main technologies available to organizations today. In a world where technological innovation is the main factor in competitive success, technological familiarity is essential.

Systems Awareness and Thinking: When organizations changed slowly, there was plenty of time to think carefully about and adapt to the systems implications of gradual technological change. Digital transformation requires everyone to have greater knowledge of how organizational processes integrate. This can be achieved through training that creates awareness of organization-wide processes. Given that these processes will be changing more often in the future, this training will need to be updated and repeated regularly.

Technical Skills: Participation in process improvement that applies and exploits technology, for example making changes based on the results of AI analysis of corporate data, requires that employees have more technical understanding of their own work processes and the ability to contribute to modifications.

Collaboration and Innovation Skills: Collaboration, bringing together knowledge and skills from across the organization, is needed to effectively implement new technologies and maximize their contribution to performance.

Corporate and Market Knowledge: In most organizations successful digital transformation requires increasing empowerment at all levels. For example, acting on the increased knowledge of areas for improvement that is provided by the internet of things and AI cannot rely on all actions being approved at a senior level. Empowerment can only work if employees have a good understanding of organizational priorities (its strategy) and the context in which they are being pursued (the market).

These categories are a useful framework for considering your own training activity. Their application will be based on your own conditions. Many training options exist today that make corporate training easier. Sending people to off-site classes, and losing their valuable contribution to today’s production, is needed much less often than in the past. On-site and/or online options now enable high-quality training to be far less disruptive to production, and many companies are taking advantage of them. AR and VR will play an increasing role in the future.

More digital transformation ideas for raising productivity
https://www.engineering.com/more-digital-transformation-ideas-for-raising-productivity/ (Wed, 30 Oct 2024)
Many organizations fail to recognize opportunities to reduce costs and optimize resources.

Digital transformation reshapes many industries by changing how organizations operate and deliver customer value. Many engineers have recognized the compelling benefits of digital transformation, including increased productivity. Through advanced software, automation, data analytics, and enhanced connectivity, digital transformation enables businesses to operate more efficiently, innovate faster, and deliver better outcomes.

Below, we explore more ways in which digital transformation boosts productivity. To read the first article in this series, click here.

Reduce costs and optimize resources

Digital transformation often leads to significant cost savings by reducing waste, improving asset utilization, and optimizing resource allocation. Through digital technologies like the Industrial Internet of Things (IIoT), simulation and digital twins, engineers can monitor equipment and processes in real-time, ensuring that resources like energy and materials are used more efficiently.

Predictive maintenance, enabled by IIoT sensors and AI, helps businesses reduce downtime and prevent costly repairs by identifying issues before they become critical (a small sketch of this pattern appears at the end of this subsection). Avoiding unscheduled outages contributes significantly to production productivity. Cloud service providers (CSPs) can reduce IT infrastructure costs, which have become a material expense, by allowing businesses to:

  • Pay for only the CSP resources they use.
  • Avoid investing in an on-premise computing environment.
  • Avoid operating costs for an on-premise computing environment.
  • Utilize a CSP computing environment with superior cybersecurity defenses.
  • Access enormous CSP computing resources instantly when needed.

Leading simulation software vendors include Avena, Autodesk, Dassault Systèmes, GE, and Siemens. Leading CSPs include Amazon, Google, IBM, and Microsoft.
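
As a toy illustration of the predictive-maintenance pattern mentioned earlier in this subsection (the data, window size and warning level are invented), a rolling-mean check on a slowly degrading vibration signal:

import numpy as np

def maintenance_alerts(vibration, window=50, warn_level=6.0):
    """Return (sample index, rolling mean) pairs where the rolling mean exceeds the warning level."""
    alerts = []
    for i in range(window, len(vibration)):
        rolling_mean = float(vibration[i - window:i].mean())
        if rolling_mean > warn_level:
            alerts.append((i, rolling_mean))
    return alerts

# Simulated bearing vibration: baseline noise plus a slow upward drift as the bearing degrades
signal = np.random.normal(4.0, 0.5, 2000) + np.linspace(0.0, 3.0, 2000)
for index, level in maintenance_alerts(signal)[:3]:
    print(f"sample {index}: rolling mean {level:.2f} mm/s, schedule maintenance before failure")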

Boost collaboration and communication

Engineers and others struggle to collaborate with staff and external partners due to data-sharing limitations and incompatible technologies. Sometimes, well-intentioned security measures become impediments.

Digital transformation introduces tools that enhance communication and collaboration across teams, departments, and geographical locations.

This interconnectedness reduces bottlenecks, shortens project timelines, and fosters a more agile workplace where engineering teams can collaborate more productively. With the rise of remote work, these tools have become even more critical, allowing businesses to maintain productivity even when their employees are not physically present in their respective offices.

Cloud-based collaboration software includes MindMeister, Miro, Microsoft Teams, Slack and Zoom. Project management software includes Asana, Microsoft Project, and Trello. This software makes it easier for employees to work together in real time, regardless of location.

Bolster agility and flexibility

Many organizations are comfortable with their current processes. That comfort often precludes an agile response when changes in the business environment threaten the business plan.

Digital transformation equips engineers with the agility to respond rapidly to changing market conditions, technological disruptions, and customer needs because the needed data is immediately accessible. In the past, engineers would often have to wait weeks or months to implement responses or launch new products, but digital tools enable them to do so in days or even hours.

For example, cloud computing allows businesses to scale up or down based on demand, enabling them to be more responsive to fluctuations in market needs. Similarly, AI and data analytics enable businesses to pivot quickly based on real-time data insights. This flexibility enhances productivity by ensuring businesses allocate resources optimally and capitalize on opportunities as they arise.

Leading software vendors for data analytics include Google Data Studio, Microsoft Power BI, Minitab, Tableau, TIBCO and TrendMiner. Leading AI software vendors include Anthropic, Google, IBM, Meta Platforms, Microsoft and OpenAI.

Empower the workforce and build skills

The workforce, including engineers, frequently feels hemmed in by narrowly defined roles and inadequate digital tools, and stifled by a ponderous top-down decision-making culture.

Digital transformation drives productivity by empowering employees with better tools and access to information. With the right digital tools, employees can complete tasks more efficiently, collaborate more easily, experiment and make better decisions.

Digital transformation often opens opportunities for employees to upskill or reskill, enabling them to perform new roles or handle more complex tasks while improving productivity.

Many businesses are investing in training and development programs that teach employees how to use advanced technologies like generative AI, ML, and data analytics tools. This training enhances individual productivity and positions the organization to adapt quickly to technological changes.

Leading immersive training software vendors include AllenComm, EI, ELB Learning, Learning Pool and SweetRush.

Digital transformation is not just a technological upgrade; it is a fundamental shift in how businesses operate and deliver value. It has proven to be a powerful driver of productivity, enabling businesses to streamline processes, automate mundane tasks, make data-driven decisions, enhance customer experience, and boost collaboration and communication.

Organizations that harness digital transformation’s power will enjoy sustained productivity gains and long-term success. They are better equipped to navigate the complexities of the marketplace, innovate faster, and remain competitive.

Applying AI in manufacturing: Q&A with Jon Hirschtick
https://www.engineering.com/applying-ai-in-manufacturing-qa-with-jon-hirschtick/ (Thu, 24 Oct 2024)
Onshape CEO and CAD legend Jon Hirschtick talks about his approach to AI and how manufacturers can extract value.

Onshape CEO Jon Hirschtick. (Image: PTC)

The general sentiment around the usefulness of artificial intelligence (AI) has seen its ups and downs over the last couple of years. After AI burst into public consciousness in late 2022, the hype has subsided. As we enter the final stretch of 2024, the current thinking is that AI is in bubble territory and companies should be wary of putting too much stock into its potential.

This is sound advice, no doubt. But as with any burgeoning tech, the early value is often found in the margins, helping companies gain an edge rather than bringing groundbreaking change. This holds true with AI—it won’t change society as we know it, at least not yet. But when applied to the right niche, it could have a significant impact.

The U.S. manufacturing sector, which was estimated to be worth about $2.5 trillion in 2022 by the National Association of Manufacturers, is one such niche that could reap significant rewards from a thoughtful and intentional approach to investing in AI.

First off, AI in manufacturing isn’t new; it just goes by a different name—machine learning. Secondly, much of the AI-based functionality being developed for manufacturing is done by major companies with well-funded R&D divisions and a strong foothold in the manufacturing market.

Indeed, virtually all of the major design software firms are finding ways to incorporate generative AI and large language models (LLMs) into their products. One such example is Boston-based software developer PTC’s cloud CAD system Onshape.

We caught up with Onshape’s CEO Jon Hirschtick to talk about its entry into the AI playground, how AI can bring value to manufacturers and what he sees in the near future for the still nascent technology. This interview was edited for clarity and conciseness.

Eng.com: How is PTC approaching AI technology?

Jon Hirschtick (JH): PTC is doing a lot with AI. We shipped some AI-based functions in our products and have a bunch of exciting AI-based projects under development, such as Onshape AI Advisor. We’re also active in the research community, where PTC employees are involved in research papers that are public. Our tools are being used by the AI community, perhaps even more than any other tools in our industry, for AI research. That’s another exciting dimension. There’s a ton of work and applications happening, and I’m proud that PTC isn’t getting overly hyped about AI, pushing things out before they’re ready. We’re focused and taking a solid approach.

Eng.com: So, at IMTS, you announced the Onshape AI Advisor, which is essentially an AI CAD assistant. Can you tell me why AI is a good fit for this application?

JH: With the Onshape AI Advisor, we’re using AI to provide expert user advice for fairly complex questions about how to use Onshape. These are the kinds of questions users typically ask another user or contact support or technical services. Instead of that, they can type a plain English question, even one that’s sophisticated, and get an answer about how to use a particular technique. AI is really good at this kind of task. It’s not rocket science anymore—it’s clearly within the wheelhouse of AI. This is very valuable for our users. We have deep professional CAD and PDM capabilities, and this is another layer of assistance.

Eng.com: Just to clarify the Onshape AI Advisor—is it a Q&A tool for users, or are you moving toward AI generating designs or doing the grunt work?

JH: It’s more than just an FAQ. It’s a conversational tool—you can ask questions, and the AI helps you with specific tasks, not just pre-set answers. As for generating designs, we’re working on it, but it’s not quite there yet. We’ve got great demos and research, but generating robust 3D geometry for manufacturing is still too complex. In entertainment, sure, you can generate digital assets, but for industrial applications, it’s not ready yet. There are too many safety concerns and error possibilities right now.

Eng.com: How long ago did you identify AI as something that needed to be developed for your product, your company and your users?

JH: It depends on how you define AI. If you define AI as machine learning, I’d say it goes back more than five years, when we started using machine learning to understand customer satisfaction through behavior. As for generative AI, almost from the moment it became known in research, we were involved in research papers, not necessarily knowing whether it would turn into real products. We had many people in the AI research community saying Onshape is the perfect platform for AI research because we’re the only cloud-native CAD and PDM system. Everything’s available through a REST API, so you don’t have to install a lot of software for large-scale usage, and with the right license agreement, you can access our public data library. Some people want to train on those 15 million human-created CAD and PDM documents. So, the research interest and commercial application started almost from day one. Over the last year or two, we’ve realized we could actually start shipping some of these things.

Eng.com: Absolutely. Your trajectory here matches many others who have integrated AI into their products. Can you talk a bit about the importance of further developing this and creating a baseline of AI capabilities to build on?

JH: I think it’s important that we explore what AI can do, and start shipping these products to customers because it’s new tech. I think AI is critical. I think our users must feel like product developers did when plastics or carbon fiber came along. It’s not just a better way of doing things; it’s a whole new set of tools that make you redefine problems. It allows you to approach problems differently. And so, the baseline is important not only for study but for releasing products. Just like with the first plastic product, you can’t know what it’s really like until you use it. We need to build reps, understand how to deliver and leverage the cloud-native solutions of Onshape.

Eng.com: The way you describe the Onshape AI Advisor brings to mind the concept of tribal knowledge within an organization. For years, we’ve talked about how companies lose access to knowledge due to retirement or attrition. This seems like a repository for that knowledge. It learns what people are doing and stores it for future users, correct?

JH: Yes, but it’s only a partial step. The AI assistant can give access to knowledge that a user might have taken with them when leaving the organization, so it helps there. But we’re far from tackling the whole problem. In the future, we might be able to develop a model specific to a company’s use of Onshape. Right now, the AI assistant only looks at general Onshape user knowledge—it doesn’t look at your specific data. But eventually, we could create a model around your company’s practices and provide insights like, “Here’s how experts in your company apply this technique.” So, it’s a partial step in that direction, but there’s a lot more we can do. We’re starting with Onshape AI, and there are other PTC products, like ServiceMax, which are also capturing expert knowledge.

Eng.com: It sounds like what you’re describing is essentially custom AI agents for a customer. Is that something on the horizon?

JH: Possibly. We’re not announcing anything yet, but if you ask me to speculate, I’d say it could definitely be part of our vision. PTC has a real leg up here, given our cloud-native infrastructure and the highly secure services we operate. We’ve been doing this for a while with Onshape and our PDM systems. So yes, we could imagine something like a custom AI agent for a company, where the AI looks at the best users in the company and helps new users align with those best practices. That’s something we could do in the future, based on the data we collect. It could be something like “What’s the best way to apply this technique in our company?” AI could inform decisions based on data, helping save time and improving efficiency.

Eng.com: AI agents are slowly becoming a thing for companies, but you need sensors and other devices to collect data for the agent. Is there a way around that?

JH: With Onshape, we don’t need to go through that manual process of collecting data. Our system captures every single action as a transaction—if you drill a hole, undo it, or modify a feature, that’s all tracked. We have more data than any other system about a user’s activity, so we don’t need to go out and collect data manually. This gives us a huge advantage in training AI applications. In the future, users might even be able to combine data from their channels, emails, and other sources, and create a composite picture of what’s happening. We’re working on ways to give more value without relying on the kind of manual collection you mentioned.

Eng.com: Another PTC product named Kepware is a software layer that collects and aggregates shop floor data. Is this something you’re planning on incorporating into your AI products in the future?

JH: I can’t announce anything specific at the moment, but Kepware is well-positioned for this and there’s definitely potential there. Kepware handles sensor data, and we handle digital data from our systems. Combining those sources could be very powerful. The point is that PTC is uniquely positioned with both Onshape and Kepware, which span the digital thread spectrum. We’re also working with ServiceMax, which collects a lot of field service data, and they’re looking into AI for capturing service expertise as well.

Eng.com: Moving on to AI and digital transformation—what have you learned about AI’s application in manufacturing while attending IMTS 2024?

JH: At IMTS, I learned that we’re still in the early days of AI in product development and manufacturing. The promise is huge, but most organizations aren’t using a lot of AI yet. That being said, the number of projects people are working on is incredible. Everyone’s still figuring out which use cases work best, balancing doability, usability and value. I saw companies using AI in some exciting ways, like summarizing manufacturing data or configuring industrial products, but the technology isn’t fully ready for more complex tasks yet. I think the next few years will be about figuring out what works and refining those applications.

Eng.com: Do you think smaller companies are adopting AI faster than larger ones?

JH: Absolutely. The smaller companies tend to be more agile, and AI doesn’t necessarily require a huge investment in hardware or infrastructure. With Onshape and the AI assistant, you don’t need big machines or complicated installations. Smaller companies can jump in and take advantage of the latest tools without needing millions in capital. They’re moving faster in many cases, though not always.

How to plan data collection, storage and visualization in an IIoT deployment
https://www.engineering.com/how-to-plan-data-collection-storage-and-visualization-in-an-iiot-deployment/ (Mon, 21 Oct 2024)
Be sure to consider scalability and future-proofing to accommodate evolving manufacturing processes and technologies.

When it comes to an IIoT (Industrial Internet of Things) implementation in manufacturing, data collection, storage, analytics and visualization are the core backplane that drives actionable insights and enables smarter operations.

How do these components typically align in an IIoT system, and what considerations should a manufacturing engineer keep in mind when planning an implementation? It can certainly get complicated, but breaking things down into their smaller parts makes it more manageable.

Data Collection

The effectiveness of data collection largely depends on sensor architecture. Depending on the equipment or process, various types of sensors (temperature, pressure, vibration, etc.) need to be deployed across critical points in the manufacturing process. Ensure sensors are selected with appropriate accuracy, environmental tolerance and response time for the specific application.

A data acquisition system (DAS) acts as an interface between these sensors and the IIoT platform. It gathers real-time data from sensors and transmits it to the edge or cloud infrastructure.

The big decision here is whether to use edge processing (local data pre-processing) or rely on centralized data gathering at the cloud level. Edge processing offers lower latency, making it ideal for real-time tasks. It also reduces bandwidth needs by processing data locally. However, it requires more upfront investment in hardware and can be harder to scale. In contrast, cloud processing handles large data volumes more easily and scales better, though it comes with higher latency and ongoing costs for bandwidth and storage. Cloud systems also need robust security measures for data transmission. A hybrid approach combining both edge and cloud processing may balance real-time processing with scalable, centralized data management, but it depends on each application and the desired outcomes.

The next big decision is determining the optimal sampling rate. Too high a sampling frequency can overwhelm your storage and bandwidth, while too low a rate may miss critical insights, particularly in dynamic manufacturing processes. Work with process engineers to determine the data sampling frequency based on process variability. Revisit this often to ensure what you think is the optimal sampling rate isn’t leaking potential value.
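
A common starting point (an assumption on our part, not a hard rule) is the Nyquist criterion: sample at least twice the fastest process dynamic you care about, with a healthy oversampling margin, then check what that implies for storage:

def sampling_plan(f_max_hz, oversample_factor=10, bytes_per_sample=4, sensor_count=100):
    """Pick a sampling rate well above the 2x Nyquist floor and estimate daily raw data volume."""
    rate_hz = oversample_factor * f_max_hz
    assert rate_hz >= 2 * f_max_hz, "below the Nyquist floor"
    gb_per_day = rate_hz * bytes_per_sample * sensor_count * 86_400 / 1e9
    return rate_hz, gb_per_day

for f_max in (0.1, 10, 1000):   # slow thermal loop, pressure loop, vibration monitoring (Hz)
    rate, gb = sampling_plan(f_max)
    print(f"f_max = {f_max} Hz -> sample at {rate} Hz, ~{gb:.2f} GB/day across 100 sensors")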

If you are going to base major decisions on the insights gained through this IIoT system, you must ensure the integrity of collected data. This means that error checking (e.g., using checksums or hashing) and redundancy mechanisms (e.g., backup data paths or local buffering) are in place to handle network failures or sensor malfunctions.

A checksum is a small-sized piece of data derived from a larger set of data, typically used to verify the integrity of that data. It acts as a digital fingerprint, created by applying a mathematical algorithm to the original data. When the data is transmitted or stored, the checksum is recalculated at the destination and compared with the original checksum to ensure that the data has not been altered, corrupted or tampered with during transmission or storage.

Hashing is the process of converting input data into a fixed-size string of characters, typically a unique value (hash), using a mathematical algorithm. This hash is used for verifying data integrity, securing communication, and enabling fast data retrieval, with each unique input producing a unique hash.
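
A minimal sketch of both ideas using only the Python standard library (SHA-256 as the hash, CRC32 as a lighter-weight checksum); the payload is a made-up reading:

import hashlib
import json
import zlib

payload = json.dumps({"sensor": "temp-07", "value": 81.4}).encode()

# Sender side: compute and attach integrity fields before transmission
sha_digest = hashlib.sha256(payload).hexdigest()   # cryptographic hash (tamper-evident)
crc = zlib.crc32(payload)                          # cheap 32-bit checksum (catches corruption)

# Receiver side: recompute and compare before trusting the data
received = payload                                 # stand-in for bytes that arrived over the network
intact = (hashlib.sha256(received).hexdigest() == sha_digest
          and zlib.crc32(received) == crc)
print("integrity verified" if intact else "corrupted in transit; request retransmission")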

When planning sensor deployment, focus on critical assets and key process variables that directly impact production efficiency, quality or safety. Implementing a hierarchical sensor strategy (high-priority sensors collecting frequent data, lower-priority ones providing long-term insights) can help balance costs and data richness.

Data Storage

Here again you are faced with a decision between local (edge) storage and a centralized cloud environment. The same pros and cons apply as in data acquisition, but your needs may be different.

Edge storage is useful for real-time, low-latency processing, especially in critical operations where immediate decision-making is necessary. It also reduces the amount of data that needs to be transmitted to the cloud.

Cloud storage is scalable and ideal for long-term storage, cross-site access and aggregation of data from multiple locations. However, the bandwidth required for real-time data streaming to the cloud can be costly, especially in large-scale manufacturing operations.

Manufacturing environments typically generate large volumes of data due to high-frequency sensors. Plan for data compression and aggregation techniques at the edge to minimize storage overhead.

Lossless compression reduces data size without any loss of information, ideal for critical data. Popular algorithms include GZIP, effective for text data, LZ4, which is fast and low latency for real-time systems, and Zstandard (Zstd), offering high compression and quick decompression for IIoT.

Lossy compression, on the other hand, is suitable for sensor data where some precision loss is acceptable in exchange for better compression. Wavelet compression is efficient for time-series data, and JPEG/MJPEG is often used for images or video streams, reducing size while maintaining most visual information.
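A small sketch of lossless compression on a synthetic batch of readings, using gzip from the Python standard library (LZ4 and Zstandard behave similarly but require third-party packages); the round trip restores the data byte for byte:

```python
import gzip
import json
import random

# A batch of high-frequency sensor readings (synthetic, for illustration)
batch = [{"t": i, "value": round(20.0 + random.random(), 3)} for i in range(10_000)]
raw = json.dumps(batch).encode("utf-8")

compressed = gzip.compress(raw)        # lossless: decompresses to identical bytes
restored = gzip.decompress(compressed)

assert restored == raw
print(f"raw: {len(raw):,} bytes, gzip: {len(compressed):,} bytes "
      f"({len(compressed) / len(raw):.0%} of original)")
```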

Data aggregation techniques help reduce data volume by combining or filtering information before transmission. Summarization involves averaging or finding min/max values over a time period. Sliding window aggregation and time bucketing group data into time intervals, reducing granularity. Event-driven aggregation sends data only when conditions are met, while threshold-based sampling and change-detection algorithms send data only when significant changes occur. Edge-based filtering and preprocessing ensure only relevant data is transmitted, and spatial and temporal aggregation combines data from multiple sources to reduce payload size.
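Two of these techniques are easy to sketch: a time-bucket aggregation that reduces each window to summary statistics, and a threshold-based (change-detection) filter that forwards a value only when it moves meaningfully. The readings and threshold below are hypothetical:

```python
from statistics import mean

def window_aggregate(readings, window_size):
    """Time bucketing: summarize each fixed-size window as min/mean/max."""
    for i in range(0, len(readings), window_size):
        window = readings[i:i + window_size]
        yield {"start": i, "min": min(window), "mean": mean(window), "max": max(window)}

def threshold_filter(readings, threshold):
    """Change detection: forward a value only when it moves enough from the last one sent."""
    last_sent = None
    for i, value in enumerate(readings):
        if last_sent is None or abs(value - last_sent) >= threshold:
            last_sent = value
            yield i, value

readings = [20.0, 20.1, 20.0, 23.5, 23.6, 23.4, 20.2, 20.1]
print(list(window_aggregate(readings, 4)))
print(list(threshold_filter(readings, 1.0)))   # only significant changes are transmitted
```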

Because edge devices often operate in resource-constrained environments, deal with real-time data and must efficiently manage the communication between local systems and central servers, there are several edge-specific considerations for optimizing data management in IIoT systems. For real-time applications, techniques like streaming compression (e.g., LZ4) and windowed aggregation help minimize latency by processing data locally. Delta encoding reduces data size by only transmitting changes from previous values, minimizing redundancy. Additionally, hierarchical aggregation allows data to be aggregated at intermediate nodes, such as gateways, before being sent to the central system, further reducing the transmission load and improving overall efficiency in multi-layered edge networks. These considerations are uniquely suited to edge computing because edge devices need to be efficient, autonomous, and responsive without relying heavily on central systems or expensive bandwidth.
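Delta encoding in particular is simple to illustrate: transmit the first value, then only the differences, which are small and repetitive and therefore compress well downstream. A minimal sketch with invented readings:

```python
def delta_encode(values):
    """Keep the first value, then store only the change from the previous value."""
    deltas = [values[0]]
    for prev, curr in zip(values, values[1:]):
        deltas.append(round(curr - prev, 6))
    return deltas

def delta_decode(deltas):
    """Rebuild the original series by accumulating the deltas."""
    values = [deltas[0]]
    for d in deltas[1:]:
        values.append(round(values[-1] + d, 6))
    return values

readings = [101.25, 101.26, 101.26, 101.27, 101.25]
encoded = delta_encode(readings)   # [101.25, 0.01, 0.0, 0.01, -0.02]
assert delta_decode(encoded) == readings
```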

You’ll also need a storage architecture that can scale to accommodate both current and future data growth. Also, implement a robust redundancy and backup strategy. With critical manufacturing data, losing information due to hardware failure or network issues can be costly. Redundant storage, preferably in different geographic locations (for disaster recovery), is crucial for resilience.

TIP: For time-sensitive data (e.g., real-time process control), store at the edge and use data batching for non-urgent data that can be transmitted to the cloud periodically, reducing latency and network costs.
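One way to implement that batching pattern is a small buffer on the edge device that flushes when it is full or old enough. In this sketch, `upload` is a stand-in for whatever MQTT/HTTP/cloud client the deployment actually uses, and the limits are illustrative:

```python
import json
import time

class EdgeBatcher:
    """Buffer non-urgent readings locally and flush them periodically."""

    def __init__(self, upload, max_items=500, max_age_s=300):
        self.upload = upload          # hypothetical transport callback
        self.max_items = max_items
        self.max_age_s = max_age_s
        self.buffer = []
        self.last_flush = time.monotonic()

    def add(self, reading):
        self.buffer.append(reading)
        if (len(self.buffer) >= self.max_items
                or time.monotonic() - self.last_flush >= self.max_age_s):
            self.flush()

    def flush(self):
        if self.buffer:
            self.upload(json.dumps(self.buffer).encode("utf-8"))
            self.buffer.clear()
        self.last_flush = time.monotonic()

batcher = EdgeBatcher(upload=lambda payload: print(f"sending {len(payload)} bytes"),
                      max_items=3)
for v in [20.1, 20.2, 20.3, 20.4]:
    batcher.add({"value": v})
```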

Analytics

Real-time analytics is essential for immediate decision-making (shutting down a faulty machine or adjusting a process parameter), while historical analytics provides long-term insights into trends and performance (predictive maintenance, yield optimization).

To enable real-time analytics, data should undergo initial pre-processing and filtering at the edge, so that only relevant insights or alerts are passed to the cloud or central system. This reduces data transfer overhead and minimizes latency in decision-making. For long-term analysis (identifying trends, root cause analysis), use batch processing techniques to handle large datasets over time. Machine learning (ML) and AI models are increasingly integrated into IIoT systems to identify anomalies, predict failures or optimize operations based on historical data.
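As an illustration of edge-side filtering, a rolling z-score check can flag anomalous readings locally so that only alerts leave the device; the window size, warm-up length and threshold below are arbitrary choices, not recommendations:

```python
from collections import deque
from statistics import mean, stdev

def detect_anomalies(stream, window=50, z_threshold=3.0):
    """Flag readings that deviate strongly from the recent rolling window."""
    history = deque(maxlen=window)
    for value in stream:
        if len(history) >= 10:                 # wait for a minimal baseline
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(value - mu) / sigma > z_threshold:
                yield value                    # only the alert is passed upstream
        history.append(value)

normal = [20.0 + 0.1 * (i % 5) for i in range(100)]
stream = normal[:60] + [35.0] + normal[60:]    # inject a spike
print(list(detect_anomalies(stream)))          # -> [35.0]
```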

IIoT analytics is more than just looking at individual sensor data; it’s about correlating data across multiple devices, sensors and even different factory lines to uncover patterns. Implement data fusion techniques where data from different sensors or sources can be combined to improve the accuracy and richness of insights.
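One common data fusion technique is an inverse-variance weighted average of the same quantity measured by several sensors, so that noisier sensors contribute less to the combined estimate; the sensor values and variances here are invented for illustration:

```python
def fuse_readings(readings):
    """Inverse-variance weighted average of (value, variance) pairs."""
    weights = [1.0 / var for _, var in readings]
    return sum(w * v for (v, _), w in zip(readings, weights)) / sum(weights)

# Three temperature sensors on the same asset (hypothetical variances)
sensors = [(72.1, 0.04), (71.8, 0.25), (72.4, 1.00)]
print(f"fused estimate: {fuse_readings(sensors):.2f}")
```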

Visualization

Visualization tools are essential for both operators and decision-makers to quickly assess the performance of processes and machines. These should include customizable dashboards that display real-time key performance indicators (KPIs) such as throughput, efficiency, downtime and machine health. KPIs should be linked to the specific objectives of the manufacturing process.

For process optimization and long-term planning, historical trends and patterns should be visualized clearly. This allows for root-cause analysis, identifying inefficiencies and making data-driven decisions about process improvements.

These visualizations should be tailored to different user roles. Operators need real-time alerts and immediate insights into machine performance, while managers or engineers might need access to historical data and trend analysis. Design the user interface (UI) and access controls with these distinctions in mind.

For advanced implementations, digital twins and augmented reality can be used to simulate and visualize complex data in 3D. Digital twins create a virtual replica of the manufacturing environment, allowing engineers to monitor and optimize operations without needing to be physically present.

Planning IIoT implementations

When planning IIoT in manufacturing, focus on building a scalable, resilient and secure architecture for data collection, storage, analytics and visualization. Ensure that data collection is optimized to balance cost and data richness, using both edge and cloud storage appropriately. Analytics capabilities should provide real-time decision support while enabling deep insights through predictive maintenance and long-term performance analysis. Visualization tools should cater to different user needs, ensuring clear, actionable insights through both real-time dashboards and historical data views. Keep in mind the challenges of data volume, latency, network bandwidth and data integrity as you design the IIoT system, with attention to scalability and future-proofing the infrastructure to accommodate evolving manufacturing processes and technologies.

The post How to plan data collection, storage and visualization in an IIoT deployment appeared first on Engineering.com.

]]>
BAE’s Falconworks R&D division aims to transform aerospace engineering https://www.engineering.com/baes-falconworks-rd-division-aims-to-transform-aerospace-engineering/ Wed, 16 Oct 2024 20:37:49 +0000 https://www.engineering.com/?p=132945 Siemens and BAE Systems partner in a massive digitalization effort in its aerospace manufacturing and engineering operations.

The post BAE’s Falconworks R&D division aims to transform aerospace engineering appeared first on Engineering.com.

]]>
‘Factory of the Future’ technologies involve integrating advanced digital tools like IoT, AI, and automation to create efficient, flexible, and intelligent manufacturing processes. (Image: BAE Systems)

At the Farnborough Airshow in July 2024, Siemens and BAE Systems announced a five-year collaboration to accelerate digital innovation in engineering and manufacturing. Using Siemens’ Xcelerator platform, this partnership seeks to transform processes within BAE Systems’ Air sector through FalconWorks, its Research and Development (R&D) division. The R&D center fosters an open innovation ecosystem, connecting suppliers, SMEs, governments, research organizations, and academia to “accelerate the innovation of future air power through the development of technology and capabilities.” It unites approximately 2,000 experts across 11 sites in the UK.

This agreement builds on a longstanding relationship, deploying Siemens’ advanced digital software, such as NX and Teamcenter to enhance sustainability, industrial digitalization, and supply chain modernization. Leaders from both companies emphasized the collaboration’s potential to drive Industry 4.0 advancements and achieve significant digital transformation in aerospace manufacturing. Iain Minton, BAE Systems’ Technology Capability Delivery Director, noted, “Siemens understands the complexities of our operating environment, so we can very quickly mature an idea to the point where it is put into practice, for example when we are looking to implement and optimize new engineering, support, or manufacturing capabilities.”

A digital engineering ecosystem for open innovation

BAE Systems’ FalconWorks is not only looking at solving today’s challenges; “it is the agile innovation powerhouse driven by […] technology teams that will develop the game-changing technologies of the future.” Simply put, it focuses on scanning the technology horizon to identify and develop the groundbreaking building blocks of the future in the Aerospace and Defense sector. Maintaining an edge in such a competitive landscape means developing industry standards and working with regulators to ensure these are acceptable to society from a safety and sustainability perspective, while focusing on effective routes to market for successful commercialization.

Fostering an open innovation ecosystem, the company embarked on a multi-year strategic investment in Digital Engineering (DE) to digitalize its systems engineering and integration capabilities, “investing in digital infrastructure and virtual, collaborative Digital Engineering Capabilities Labs (DECL) to drive rapid innovation, state-of-the-art digital technologies, and cloud migration.” This includes collaboration with SMEs, academia, legislators, and industry leaders, along with co-funding start-ups to develop new technologies.

Per a 2020 whitepaper, BAE Systems elaborated on its Advanced Integrated Data Environment for Agile Manufacturing, Integration and Sustainment Excellence (ADAMS) reference architecture to fulfil this vision: “this digital enterprise is built on a model-based, integrated development or data environment that supports multi-disciplinary, multi-organization stakeholders and leverages product-line reference architectures and a shared model library to develop, deliver, and sustain a system through its lifecycle.” Clearly, the digital ecosystem is only an enabler, part of a data layer foundational to drive process and product innovation.

Aerospace digital twins and data management

PLM serves as the backbone, integrating technologies, data, and processes to ensure seamless information flow across business functions and the entire product lifecycle—from concept and design to manufacturing, maintenance, and recycling. PLM processes require connected data flows across the manufacturing and extended enterprise ecosystem. Through integration and workflow automation, all product data, from design to production, must be digitized and interconnected, facilitating seamless communication between systems, machines, and teams. Such integration allows for real-time monitoring, data-driven decision-making, and automation, ensuring that the factory operates efficiently and can quickly adapt to changes in demand or production requirements.

Additionally, PLM supports continuous improvement by enabling feedback loops from the factory floor back to design and engineering, leading to optimized processes and product quality. For instance, this includes the implementation of advanced manufacturing techniques, such as additive manufacturing, 3D printing, and automated assembly, connecting CAD and software data with production processes by ensuring that all design and manufacturing data are centrally managed and accessible. In the context of BAE’s vision, PLM can facilitate the integration of Digital Twins, virtual representations that allow real-time monitoring and optimization of manufacturing processes—ensuring that the factory can respond dynamically to changes and demands. Aerospace Digital Twins are crucial for driving Industry 4.0 by enhancing efficiency, reducing costs, and driving quality adherence, compliance, and sustainability. The top five Digital Twins essential for this purpose include:

  1. Product Digital Twins: Represent physical aircraft or components throughout their lifecycle, enabling real-time monitoring, predictive maintenance, and performance optimization to reduce downtime and extend asset lifespan.
  2. Process Digital Twins: Model and optimize manufacturing and assembly processes, allowing for quick identification of inefficiencies, waste reduction, and overall production quality improvement.
  3. Supply Chain Digital Twins: Provide a real-time, end-to-end view of the supply chain, managing disruptions, optimizing logistics, and ensuring timely delivery of components.
  4. Operational Digital Twins: Monitor in-service aircraft and systems, enabling optimization of flight paths, fuel consumption, and maintenance schedules for better performance and reduced costs.
  5. Human Digital Twins: Simulate interactions between humans and machines, optimizing human factors, enhancing training, and improving safety by modeling human responses to various scenarios.

Connected, sustainable asset optimization

A connected intelligent factory is a data-driven manufacturing environment that uses advanced automation, real-time analytics, and interconnected systems to optimize aerospace component production, assembly, and maintenance. The Aerospace industry strives to balance cutting-edge innovation, which fosters competitive advantage, with through-life optimization of complex assets to effectively capitalize on long-lifecycle products. Asset compliance traceability and through-life monitoring are essential for Aerospace and Defense and other heavily regulated operations, supporting new business models—from product development to full in-service operations management.

To that effect, BAE Systems’ Digital Intelligence division acquired Eurostep in 2023 to accelerate the development of its digital asset management suite, PropheSEA™, a platform to “consolidate and share […] complex asset data securely, allowing assets to be managed proactively, reducing operating costs and maximizing asset availability.” Mattias Johansson, Eurostep CEO, highlighted that “Eurostep has collaborated with BAE Systems for many years with […] ShareAspace sitting at the heart of Digital Intelligence’s Digital Asset Management product suite [to help organizations] securely collaborate across the supply chain and cost effectively manage their assets through life.” Regulators also require through-life carbon footprint measurement, which can be difficult to forecast with products whose asset life can span 40 to 50 years.

As presented at one of the ACE conferences championed by Aras in 2016, Kally Hagstrom, then Manager of Information Systems with BAE Systems, explained why complex long-lifecycle products require a PLM strategy that enables a high level of resiliency. BAE Systems then initiated the implementation of Aras Innovator alongside its legacy Teamcenter platform to consolidate several PLM business capabilities, from requirements to change management, systems engineering, supplier collaboration, process planning and MBOM management, document and project management, as well as obsolescence management. Judging by the recent Siemens partnership extension, the legacy Teamcenter environment clearly remains in place at BAE Systems today, regaining ground in the maintenance, repair and overhaul (MRO) space and/or expanding further into downstream manufacturing digitalization. Furthermore, it would be interesting to hear if and how BAE Systems is driving the coexistence of multiple PLM platforms in its DE ecosystem to support open innovation and manufacturing, possibly leveraging its 2023 investment in Eurostep.

To paint the full picture, it would be necessary to dig deeper into how BAE Systems collaborates with its supply chains and manages its intellectual property. This would also require a broader understanding of how the OEM connects the dots across its PLM, ERP and MES landscape to drive a truly end-to-end digital and data-connected landscape. By enabling sustainable design and efficient resource management, integrated PLM can help reduce the environmental impact of aerospace manufacturing. This aligns with BAE’s broader goals of innovation and sustainability, ensuring that BAE Systems’ Factory of the Future is both technologically advanced and environmentally responsible.

The post BAE’s Falconworks R&D division aims to transform aerospace engineering appeared first on Engineering.com.

]]>
How digital transformation raises productivity https://www.engineering.com/how-digital-transformation-raises-productivity/ Wed, 16 Oct 2024 15:24:09 +0000 https://www.engineering.com/?p=132925 Exploring various ways engineers leverage digital transformation to boost productivity.

The post How digital transformation raises productivity appeared first on Engineering.com.

]]>

Digital transformation has become a vital force in reshaping industries. Adopting and integrating digital technologies into most business areas fundamentally changes how organizations operate and deliver customer value. As engineers worldwide navigate this evolution, one of the compelling benefits of digital transformation is its ability to raise productivity. Through advanced software, automation, data analytics, and enhanced connectivity, digital transformation enables businesses to operate more efficiently, innovate faster, and deliver better outcomes.

Below, we explore the various ways in which digital transformation boosts productivity.

Streamline business processes

Today, engineers spend too much time on low-productivity work such as:

  • Hunting for data.
  • Seeking access to data.
  • Cleaning data to a reasonable level of accuracy and completeness.
  • Integrating data using Excel.
  • Waiting for others to complete manual work.

Digital transformation delivers significant productivity gains when it includes re-engineering business processes by:

  • Simplifying the steps involved.
  • Designing steps to minimize the opportunities for errors.
  • Improving the availability of relevant digital data to those performing the process.
  • Integrating more efficient digital tools into the process.

Digital transformation often triggers the replacement of legacy systems with current software, such as enterprise resource planning (ERP) systems, customer relationship management (CRM) platforms, and industry-specific software-as-a-service (SaaS) solutions. Current software packages and SaaS solutions offer:

  • More comprehensive functionality.
  • Access to best practice processes.
  • Better alignment with various business functions.
  • Outsourced software maintenance.

For example, ERP systems consolidate many business processes—such as financial accounting, procurement, production management and supply chain management—into one seamless system, reducing the time spent on redundant tasks and increasing the speed at which businesses can operate and make decisions. Streamlined processes lead to improved staff productivity, fewer delays and higher accuracy.

Leading software to design and automate custom processes include Appian, Microsoft Power Automate, Outsystems, Pegasystems Pega, and Oracle BPM Suite. Leading ERP software vendors include Infor, Microsoft Dynamics 365, Oracle Netsuite, SAP S/4 HANA, and Workday.

Automate repetitive tasks

In many businesses, repetitive tasks consume significant time for engineers. The work is unrewarding and error-prone.

Digital transformation enables the automation of repetitive tasks to drive productivity and data quality. Businesses can automate routine and repetitive tasks, such as data entry, report generation, and many customer service interactions, through the use of artificial intelligence (AI), robotic process automation (RPA), and machine learning (ML).

For example, RPA can streamline back-office operations like finance and HR by processing transactions, aggregating data, and performing multi-step workflows without human intervention. This automation reduces the likelihood of errors and speeds up processes. It also frees up time for employees to focus on more strategic tasks that require critical thinking and creativity that directly contribute to business growth.

Leading software vendors for automating repetitive tasks include Automation Anywhere, Datamatics, SS&C Blue Prism and UiPath.

Support data-driven decision-making

Today, engineers and others make decisions in many businesses based on experience, partial data and hunches. That’s often riskier than it seems.

Digital transformation facilitates collecting and analyzing vast amounts of data, which organizations can leverage to make more informed decisions. Data analytics tools help engineers rapidly analyze performance patterns, customer behaviours, and market trends, assisting businesses to adjust strategies quickly and stay ahead of competitors.

Using data to drive decision-making processes enhances productivity by enabling more precise forecasting, leading to better production management, inventory management, and resource allocation. Data-driven decision-making processes are particularly valuable in industries like retail, manufacturing, and healthcare, where many minor improvements in efficiency can lead to significant cost savings and faster service delivery.

Leading software vendors for data-driven decision-making include Altair, Alteryx, Databricks, IBM Watson Studio, Oracle Analytics, SAP Analytics Cloud and Snowflake. Leading software vendors for data visualization include Google Data Studio, Microsoft Power BI, Minitab, Tableau, TIBCO and TrendMiner.

Enhance the customer experience

Organizations continue to experience difficulties delivering the customer service they aspire to.

A key goal of digital transformation is improving the customer experience in all channels, digital or otherwise. Tools such as AI-powered chatbots, self-service portals, and mobile applications allow businesses to serve customers more efficiently and responsively. Digital transformation enables engineers to interact with customers in real time, providing faster responses to inquiries, better product recommendations, and more personalized experiences.

Improved customer experiences can increase customer satisfaction, loyalty, and repeat business. When it’s possible to assign fewer employees to resolve common customer issues, productivity increases and costs decrease. Dedicating more employees to customer relationship building and innovation drives sales and profitability.

Leading software vendors for customer experience management include Birdeye, HubSpot, Microsoft Dynamics 365 Customer Insights, Podium, and Zendesk. Leading software vendors for call center operation include Five9, Nextiva, Nice CX-One, RingCentral and Talkdesk.

Digital transformation is not just a technological upgrade but a fundamental shift in how businesses operate and deliver value. It has proven to be a powerful driver of productivity, enabling businesses to streamline processes, automate mundane tasks, make data-driven decisions, enhance customer experience, and boost collaboration and communication.

Organizations that embrace digital transformation are better equipped to navigate the complexities of the modern marketplace, innovate faster, and remain competitive. As technology continues to evolve, the businesses that effectively harness the power of digital transformation will enjoy sustained productivity gains and long-term success.

The post How digital transformation raises productivity appeared first on Engineering.com.

]]>
Why digital transformation resonates more than PLM in the boardroom https://www.engineering.com/why-digital-transformation-resonates-more-than-plm-in-the-boardroom/ Thu, 10 Oct 2024 17:06:39 +0000 https://www.engineering.com/?p=132733 While PLM is often seen as a specialized discipline, digital transformation encompasses everything.

The post Why digital transformation resonates more than PLM in the boardroom appeared first on Engineering.com.

]]>

In simple terms, digital transformation is about using modern technology to make organizations run better and drive more value to their customers, employees and partners. It often encompasses improvements in decision-making, operations and customer service, and it opens opportunities to rethink and reshape how a business operates. Whether through Enterprise Resource Planning (ERP), Customer Relationship Management (CRM), Product Lifecycle Management (PLM), Manufacturing Execution Systems (MES), or advanced technologies like Artificial Intelligence (AI), Machine Learning (ML), and the Internet of Things (IoT), digital transformation helps redefine business operations for the future. However, the term is often used loosely, becoming synonymous with general technology adoption.

On the other hand, business transformation goes beyond simply implementing new tools. It involves structural changes, strategy overhauls, and redefined goals in response to evolving opportunities, challenges, or long-term visions. Digital tools may be central to enabling these shifts, but the key to successful transformation is in rethinking how the business operates.

To stay competitive, businesses need to undergo both digital and business transformations in tandem, as technological advancements often necessitate cultural, workflow, and strategic changes to fully unlock their potential.

Value drivers for digital transformation

Digital transformation rests on four foundational pillars that enable organizations to improve and innovate. These pillars include:

  1. Process and operating model transformation: Redesigning how work gets done across the organization to improve effectiveness, efficiency, agility, speed and value delivery.
  2. Technical debt improvement: Reducing reliance on outdated processes and systems that are expensive to maintain and limit the business’s ability to innovate quickly.
  3. Integrated ecosystem and data flow: Breaking down silos to ensure seamless data sharing across systems, departments, and possibly organizations—improving decision-making and operational alignment.
  4. Data analytics and insights: Leveraging data as a strategic asset for real-time decision-making, predictive analysis, and improved business outcomes.

These four pillars, when combined, create a holistic approach to digital transformation that looks beyond short-term gains. The focus should be on creating a long-term roadmap that acknowledges the complexity of transformation, ensuring the initiative is aligned with the business’s strategic objectives and future growth. It is essential to see digital as “transformational” because it is not only about managing technical upgrades but about reshaping the entire business ecosystem and ways of working.

Linking digital and business transformation

Despite the overwhelming focus on technology, from cloud to AI, people are the true heroes of successful transformations. For digital initiatives to work, business and technology subject matter experts need to be fully engaged, and open to adapting to new processes and tools. Without their buy-in, even the most advanced technologies or processes will fall short. Effective leadership and a strong change management approach are essential to foster the cultural shifts required for successful digital transformation.

Digital transformation is inherently linked to business transformation because the introduction of new technologies often necessitates changes in strategy, workflows, ways of working, and business models. This is not a one-time effort but an ongoing process that involves constant adaptation. As businesses evolve to remain competitive, digital tools provide the platform for that transformation, enabling greater flexibility, agility, and innovation.

The importance of cultural factors cannot be overstated. Digital transformation initiatives that fail to address the human element—training, collaboration, mindset shifts—often struggle. A culture of continuous learning and openness to change is crucial to realizing the full potential of any digital transformation effort.

Digitalizing the end-to-end product lifecycle

PLM, or whatever it is referred to in the organization, is not just about technology; it is a strategic approach that spans the entire enterprise. From initial concept and design to production, service, delisting, and circular economy, PLM ensures a seamless flow of information and processes throughout a product’s lifecycle. It powers collaboration, automation, and data-driven workflows that are essential to innovation. However, while PLM focuses primarily on managing product data and development processes, it complements other enterprise solutions like ERP, which handles broader financial, procurement, learning and development, and several downstream operational tasks.

Contrary to common misconceptions, PLM and ERP are not in opposition; they are complementary and often integrated. I am of the view that the wider PLM scope spans across ERP and into MRP/MES. Altogether, these disciplines and their associated digital solutions enable organizations to manage both product-specific and operational data, creating a holistic view of the business. The scope of PLM has expanded, and many solution providers now refer to their offerings as part of digital transformation rather than PLM specifically. This reflects a broader strategic focus that includes upstream (idea and design phases) and downstream (customer service and product updates) functions, blurring traditional boundaries between departments and roles.

The broader appeal of digital transformation

At the executive level, the term “digital transformation” resonates far more than PLM because it encompasses a broader range of business functions and opportunities, including:

  • Broader business impact across customer experience, operations, and innovation—whereas PLM is often seen as limited to product development and management.
  • Strategic alignment with long-term business goals like growth, agility, and competitiveness, making it more relatable to board-level discussions focused on overall business strategy.
  • Customer-centric focus: While PLM is product-centric, digital transformation places greater emphasis on improving customer experience, which is a top priority for board members looking to drive revenue and loyalty.
  • Cross-departmental relevance as digital impacts all parts of the organization, from finance and marketing to supply chain and HR, making it a more comprehensive initiative that engages the entire leadership team; PLM often remains primarily associated with R&D functions.
  • Future-proofing the business by adopting emerging technologies like AI, IoT, and data analytics that create new opportunities for innovation.
  • Easier to communicate and measure at the executive level—such as improved operational efficiency, cost savings, and better customer satisfaction—compared to the more specialized and technical benefits of PLM.

While PLM is often seen as a specialized (engineering-rooted) discipline, digital transformation covers everything from improving customer experience to streamlining operations and driving innovation. In many cases, PLM’s associated technical complexities make it harder for board members to grasp its full value, whereas digital transformation offers a more expansive narrative that aligns with the company’s broader goals. Key questions about PLM ownership at board level remain prominent in many organizations and across industries, spanning the Chief Technology Officer (CTO), the Chief Data (or Digital) Officer (CDO), and other management board executives.

Vendors and analysts have capitalized on this shift, with all enterprise platform providers positioning their offerings under the banner of digital transformation. This shift often emphasizes transitioning from legacy systems to new technologies like digital threads, digital twins, and cloud platforms. Once again, true transformation goes beyond technology—it comes from the adoption of these tools by the business and the realization of their value in day-to-day operations.

Digital transformation goes beyond the adoption of new tools; it is about enabling and reshaping the entire business. In this journey, PLM plays a critical role, but its contribution is part of a larger, more interconnected strategy that involves every aspect of the enterprise. Ultimately, for businesses to thrive, they need to embrace the full scope of transformation—leveraging technology, people, and processes together to create lasting value.

The post Why digital transformation resonates more than PLM in the boardroom appeared first on Engineering.com.

]]>