Michael Ouellette, Author at Engineering.com
https://www.engineering.com/author/michael-ouellette/

Applying AI in manufacturing: Q&A with Jon Hirschtick
https://www.engineering.com/applying-ai-in-manufacturing-qa-with-jon-hirschtick/ | Thu, 24 Oct 2024
Onshape CEO and CAD legend Jon Hirschtick talks about his approach to AI and how manufacturers can extract value.

Onshape CEO Jon Hirschtick. (Image: PTC)

The general sentiment around the usefulness of artificial intelligence (AI) has seen its ups and downs over the last couple of years. After bursting into public consciousness in late 2022, the hype has subsided. As we enter the final stretch of 2024, the current thinking is that AI is in bubble territory and that companies should be wary of putting too much stock in its potential.

This is sound advice, no doubt. But as with any burgeoning tech, the early value is often found in the margins, helping companies gain an edge rather than bringing groundbreaking change. This holds true with AI—it won’t change society as we know it, at least not yet. But when applied to the right niche, it could have a significant impact.

The U.S. manufacturing sector, which was estimated to be worth about $2.5 trillion in 2022 by the National Association of Manufacturers, is one such niche that could reap significant rewards from a thoughtful and intentional approach to investing in AI.

First off, AI in manufacturing isn’t new; it just goes by a different name: machine learning. Secondly, much of the AI-based functionality being developed for manufacturing comes from major companies with well-funded R&D divisions and a strong foothold in the manufacturing market.

Indeed, virtually all of the major design software firms are finding ways to incorporate generative AI and large language models (LLMs) into their products. One such example is Boston-based software developer PTC’s cloud CAD system Onshape.

We caught up with Onshape CEO Jon Hirschtick to talk about the company’s entry into the AI playground, how AI can bring value to manufacturers and what he sees in the near future for the still-nascent technology. This interview was edited for clarity and conciseness.

Eng.com: How is PTC approaching AI technology?

Jon Hirschtick (JH): PTC is doing a lot with AI. We shipped some AI-based functions in our products and have a bunch of exciting AI-based projects under development, such as Onshape AI Advisor. We’re also active in the research community, where PTC employees are involved in research papers that are public. Our tools are being used by the AI community, perhaps even more than any other tools in our industry, for AI research. That’s another exciting dimension. There’s a ton of work and applications happening, and I’m proud that PTC isn’t getting overly hyped about AI, pushing things out before they’re ready. We’re focused and taking a solid approach.

Eng.com: So, at IMTS, you announced the Onshape AI Advisor, which is essentially an AI CAD assistant. Can you tell me why AI is a good fit for this application?

JH: With the Onshape AI Advisor, we’re using AI to provide expert user advice for fairly complex questions about how to use Onshape. These are the kinds of questions users typically ask another user or contact support or technical services. Instead of that, they can type a plain English question, even one that’s sophisticated, and get an answer about how to use a particular technique. AI is really good at this kind of task. It’s not rocket science anymore—it’s clearly within the wheelhouse of AI. This is very valuable for our users. We have deep professional CAD and PDM capabilities, and this is another layer of assistance.

Eng.com: Just to clarify the Onshape AI Advisor—is it a Q&A tool for users, or are you moving toward AI generating designs or doing the grunt work?

JH: It’s more than just an FAQ. It’s a conversational tool—you can ask questions, and the AI helps you with specific tasks, not just pre-set answers. As for generating designs, we’re working on it, but it’s not quite there yet. We’ve got great demos and research, but generating robust 3D geometry for manufacturing is still too complex. In entertainment, sure, you can generate digital assets, but for industrial applications, it’s not ready yet. There are too many safety concerns and error possibilities right now.

Eng.com: How long ago did you identify AI as something that needed to be developed for your product, your company and your users?

JH: It depends on how you define AI. If you define AI as machine learning, I’d say it goes back more than five years, when we started using machine learning to understand customer satisfaction through behavior. As for generative AI, almost from the moment it became known in research, we were involved in research papers, not necessarily knowing whether it would turn into real products. We had many people in the AI research community saying Onshape is the perfect platform for AI research because we’re the only cloud-native CAD and PDM system. Everything’s available through a REST API, so you don’t have to install a lot of software for large-scale usage, and with the right license agreement, you can access our public data library. Some people want to train on those 15 million human-created CAD and PDM documents. So, the research interest and commercial application started almost from day one. Over the last year or two, we’ve realized we could actually start shipping some of these things.

Eng.com: Absolutely. Your trajectory here matches many others who have integrated AI into their products. Can you talk a bit about the importance of further developing this and creating a baseline of AI capabilities to build on?

JH: I think it’s important that we explore what AI can do, and start shipping these products to customers because it’s new tech. I think AI is critical. I think our users must feel like product developers did when plastics or carbon fiber came along. It’s not just a better way of doing things; it’s a whole new set of tools that make you redefine problems. It allows you to approach problems differently. And so, the baseline is important not only for study but for releasing products. Just like with the first plastic product, you can’t know what it’s really like until you use it. We need to build reps, understand how to deliver and leverage the cloud-native solutions of Onshape.

Eng.com: The way you describe the Onshape AI Advisor brings to mind the concept of tribal knowledge within an organization. For years, we’ve talked about how companies lose access to knowledge due to retirement or attrition. This seems like a repository for that knowledge. It learns what people are doing and stores it for future users, correct?

JH: Yes, but it’s only a partial step. The AI assistant can give access to knowledge that a user might have taken with them when leaving the organization, so it helps there. But we’re far from tackling the whole problem. In the future, we might be able to develop a model specific to a company’s use of Onshape. Right now, the AI assistant only looks at general Onshape user knowledge—it doesn’t look at your specific data. But eventually, we could create a model around your company’s practices and provide insights like, “Here’s how experts in your company apply this technique.” So, it’s a partial step in that direction, but there’s a lot more we can do. We’re starting with Onshape AI, and there are other PTC products, like ServiceMax, which are also capturing expert knowledge.

Eng.com: It sounds like what you’re describing is essentially custom AI agents for a customer. Is that something on the horizon?

JH: Possibly. We’re not announcing anything yet, but if you ask me to speculate, I’d say it could definitely be part of our vision. PTC has a real leg up here, given our cloud-native infrastructure and the highly secure services we operate. We’ve been doing this for a while with Onshape and our PDM systems. So yes, we could imagine something like a custom AI agent for a company, where the AI looks at the best users in the company and helps new users align with those best practices. That’s something we could do in the future, based on the data we collect. It could be something like “What’s the best way to apply this technique in our company?” AI could inform decisions based on data, helping save time and improving efficiency.

Eng.com: AI agents are slowly becoming a thing for companies, but you need sensors and other devices to collect data for the agent. Is there a way around that?

JH: With Onshape, we don’t need to go through that manual process of collecting data. Our system captures every single action as a transaction—if you drill a hole, undo it, or modify a feature, that’s all tracked. We have more data than any other system about a user’s activity, so we don’t need to go out and collect data manually. This gives us a huge advantage in training AI applications. In the future, users might even be able to combine data from their channels, emails, and other sources, and create a composite picture of what’s happening. We’re working on ways to give more value without relying on the kind of manual collection you mentioned.

Eng.com: Another PTC product named Kepware is a software layer that collects and aggregates shop floor data. Is this something you’re planning on incorporating into your AI products in the future?

JH: I can’t announce anything specific at the moment, but Kepware is well-positioned for this and there’s definitely potential there. Kepware handles sensor data, and we handle digital data from our systems. Combining those sources could be very powerful. The point is that PTC is uniquely positioned with both Onshape and Kepware, which span the digital thread spectrum. We’re also working with ServiceMax, which collects a lot of field service data, and they’re looking into AI for capturing service expertise as well.

Eng.com: Moving on to AI and digital transformation—what have you learned about AI’s application in manufacturing while attending IMTS 2024?

JH: At IMTS, I learned that we’re still in the early days of AI in product development and manufacturing. The promise is huge, but most organizations aren’t using a lot of AI yet. That being said, the number of projects people are working on is incredible. Everyone’s still figuring out which use cases work best, balancing doability, usability and value. I saw companies using AI in some exciting ways, like summarizing manufacturing data or configuring industrial products, but the technology isn’t fully ready for more complex tasks yet. I think the next few years will be about figuring out what works and refining those applications.

Eng.com: Do you think smaller companies are adopting AI faster than larger ones?

JH: Absolutely. The smaller companies tend to be more agile, and AI doesn’t necessarily require a huge investment in hardware or infrastructure. With Onshape and the AI assistant, you don’t need big machines or complicated installations. Smaller companies can jump in and take advantage of the latest tools without needing millions in capital. They’re moving faster in many cases, though not always.

How to plan data collection, storage and visualization in an IIoT deployment
https://www.engineering.com/how-to-plan-data-collection-storage-and-visualization-in-an-iiot-deployment/ | Mon, 21 Oct 2024
Be sure to consider scalability and future-proofing to accommodate evolving manufacturing processes and technologies.


When it comes to an IIoT (Industrial Internet of Things) implementation in manufacturing, data collection, storage, analytics and visualization are the core backplane that drives actionable insights and enables smarter operations.

How do these components typically align in an IIoT system, and what considerations should a manufacturing engineer keep in mind when planning an implementation? It can certainly get complicated, but breaking things down into their smaller parts makes it more manageable.

Data Collection

The effectiveness of data collection largely depends on sensor architecture. Depending on the equipment or process, various types of sensors (temperature, pressure, vibration, etc.) need to be deployed across critical points in the manufacturing process. Ensure sensors are selected with appropriate accuracy, environmental tolerance and response time for the specific application.

A data acquisition system (DAS) acts as an interface between these sensors and the IIoT platform. It gathers real-time data from sensors and transmits it to the edge or cloud infrastructure. The big decision here is whether to use edge processing (local data pre-processing) or rely on centralized data gathering at the cloud level. Edge processing offers lower latency, making it ideal for real-time tasks. It also reduces bandwidth needs by processing data locally. However, it requires more upfront investment in hardware and can be harder to scale. In contrast, cloud processing handles large data volumes more easily and scales better, though it comes with higher latency and ongoing costs for bandwidth and storage. Cloud systems also need robust security measures for data transmission. A hybrid approach combining both edge and cloud processing might be an option that balances real-time processing with scalable, centralized data management, but it depends on each application and the desired outcomes.

The next big decision is to determine the optimal sampling rate. Too high a sampling frequency can overwhelm your storage and bandwidth, while too low a frequency may miss critical insights, particularly in dynamic manufacturing processes. Work with process engineers to set the data sampling frequency based on process variability, and revisit it often to make sure what you think is the optimal rate isn’t leaking potential value.
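A useful starting point is the Nyquist criterion: sample at least twice as fast as the quickest process dynamic you care about, and in practice apply a generous margin. The helper below is a minimal sketch of that rule of thumb; the 10x oversampling factor and the example vibration bandwidth are illustrative assumptions, not recommendations for any specific process.

```python
def recommended_sampling_rate_hz(fastest_process_hz: float, oversample_factor: float = 10.0) -> float:
    """Suggest a sampling rate for a sensor channel.

    The Nyquist criterion requires sampling at more than 2x the fastest frequency
    of interest; the oversample_factor (assumed 10x here) is the practical margin
    often used to capture transients cleanly.
    """
    if oversample_factor < 2.0:
        raise ValueError("sampling below 2x the fastest dynamic violates Nyquist")
    return oversample_factor * fastest_process_hz


# Example: vibration content of interest up to 500 Hz
print(recommended_sampling_rate_hz(500.0))  # 5000.0 samples per second
```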

If you are going to base major decisions on the insights gained through this IIoT system, you must ensure the integrity of the collected data. This means that error checking (e.g., using checksums or hashing) and redundancy mechanisms (e.g., backup data paths or local buffering) are in place to handle network failures or sensor malfunctions.

A checksum is a small-sized piece of data derived from a larger set of data, typically used to verify the integrity of that data. It acts as a digital fingerprint, created by applying a mathematical algorithm to the original data. When the data is transmitted or stored, the checksum is recalculated at the destination and compared with the original checksum to ensure that the data has not been altered, corrupted or tampered with during transmission or storage.

Hashing is the process of converting input data into a fixed-size string of characters, typically a unique value (hash), using a mathematical algorithm. This hash is used for verifying data integrity, securing communication, and enabling fast data retrieval, with each unique input producing a unique hash.
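As a minimal sketch of that idea, the snippet below computes a SHA-256 digest for a sensor payload before transmission and verifies it on receipt. The payload fields and sensor name are illustrative assumptions; any deterministic serialization works as long as sender and receiver agree on it.

```python
import hashlib
import json


def fingerprint(payload: dict) -> str:
    """Return a SHA-256 hash of the payload, serialized deterministically."""
    canonical = json.dumps(payload, sort_keys=True).encode("utf-8")
    return hashlib.sha256(canonical).hexdigest()


# Sender side: attach the digest to the message
reading = {"sensor_id": "press-07", "temp_c": 81.4, "ts": 1729531200}
message = {"data": reading, "checksum": fingerprint(reading)}

# Receiver side: recompute and compare to detect corruption or tampering
received = message  # in practice this arrives over the network
assert fingerprint(received["data"]) == received["checksum"], "integrity check failed"
```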

When planning sensor deployment, focus on critical assets and key process variables that directly impact production efficiency, quality or safety. Implementing a hierarchical sensor strategy (high-priority sensors collecting frequent data, lower-priority ones providing long-term insights) can help balance costs and data richness.

Data Storage

Here again you are faced with a decision between local (edge) storage and a centralized cloud environment. The same pros and cons apply as in data acquisition, but your needs may be different.

Edge storage is useful for real-time, low-latency processing, especially in critical operations where immediate decision-making is necessary. It also reduces the amount of data that needs to be transmitted to the cloud.

Cloud storage is scalable and ideal for long-term storage, cross-site access and aggregation of data from multiple locations. However, the bandwidth required for real-time data streaming to the cloud can be costly, especially in large-scale manufacturing operations.

Manufacturing environments typically generate large volumes of data due to high-frequency sensors. Plan for data compression and aggregation techniques at the edge to minimize storage overhead.

Lossless compression reduces data size without any loss of information, which makes it ideal for critical data. Popular algorithms include GZIP (effective for text data), LZ4 (fast and low-latency, suited to real-time systems) and Zstandard (Zstd), which offers high compression ratios with quick decompression for IIoT workloads.
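As a rough illustration of lossless compression at the edge, the sketch below compresses a batch of readings with Python's standard zlib module (the DEFLATE algorithm behind GZIP). The sample batch is an illustrative assumption; repetitive time-series data like this typically compresses very well.

```python
import json
import zlib

# A batch of repetitive time-series readings (illustrative values)
batch = [{"sensor_id": "vib-12", "ts": 1729531200 + i, "rms_g": 0.021 + (i % 3) * 0.001}
         for i in range(500)]

raw = json.dumps(batch).encode("utf-8")
compressed = zlib.compress(raw)  # lossless: the original is fully recoverable

print(f"raw: {len(raw)} bytes, compressed: {len(compressed)} bytes")
assert zlib.decompress(compressed) == raw  # no information lost
```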

Lossy compression, on the other hand, is suitable for sensor data where some precision loss is acceptable in exchange for better compression. Wavelet compression is efficient for time-series data, and JPEG/MJPEG is often used for images or video streams, reducing size while maintaining most visual information.

Data aggregation techniques help reduce data volume by combining or filtering information before transmission. Summarization involves averaging or finding min/max values over a time period. Sliding window aggregation and time bucketing group data into time intervals, reducing granularity. Event-driven aggregation sends data only when conditions are met, while threshold-based sampling and change-detection algorithms send data only when significant changes occur. Edge-based filtering and preprocessing ensure only relevant data is transmitted, and spatial and temporal aggregation combines data from multiple sources to reduce payload size.
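The sketch below shows two of these techniques side by side: time bucketing with summarization, and threshold-based sampling that forwards a reading only when it changes enough to matter. The bucket size and change threshold are illustrative assumptions to be tuned per process.

```python
from statistics import mean


def time_bucket_average(readings, bucket_seconds=60):
    """Summarize (timestamp, value) readings into one average per time bucket."""
    buckets = {}
    for ts, value in readings:
        buckets.setdefault(ts // bucket_seconds, []).append(value)
    return {bucket * bucket_seconds: mean(values) for bucket, values in buckets.items()}


def threshold_filter(readings, min_change=0.5):
    """Forward a reading only when it differs enough from the last one sent."""
    sent, last = [], None
    for ts, value in readings:
        if last is None or abs(value - last) >= min_change:
            sent.append((ts, value))
            last = value
    return sent


samples = [(1729531200 + i, 20.0 + 0.05 * i) for i in range(180)]
print(len(samples), len(time_bucket_average(samples)), len(threshold_filter(samples)))
```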

Because edge devices often operate in resource-constrained environments, deal with real-time data and must efficiently manage the communication between local systems and central servers, there are several edge-specific considerations for optimizing data management in IIoT systems. For real-time applications, techniques like streaming compression (e.g., LZ4) and windowed aggregation help minimize latency by processing data locally. Delta encoding reduces data size by only transmitting changes from previous values, minimizing redundancy. Additionally, hierarchical aggregation allows data to be aggregated at intermediate nodes, such as gateways, before being sent to the central system, further reducing the transmission load and improving overall efficiency in multi-layered edge networks. These considerations are uniquely suited to edge computing because edge devices need to be efficient, autonomous, and responsive without relying heavily on central systems or expensive bandwidth.
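Delta encoding in particular is simple to sketch: transmit the first reading in full, then only the differences, which keeps payloads small when consecutive values barely change. The example below is a minimal, self-contained illustration; a real gateway would pair it with the batching and compression techniques above.

```python
def delta_encode(values):
    """Encode a series as its first value followed by successive differences."""
    if not values:
        return []
    deltas = [values[0]]
    for prev, curr in zip(values, values[1:]):
        deltas.append(curr - prev)
    return deltas


def delta_decode(deltas):
    """Reverse delta encoding back into the original series."""
    values, total = [], 0
    for d in deltas:
        total += d
        values.append(total)
    return values


readings = [1012, 1013, 1013, 1015, 1014]
encoded = delta_encode(readings)        # [1012, 1, 0, 2, -1] -- mostly small numbers
assert delta_decode(encoded) == readings
```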

You’ll also need a storage architecture that can scale to accommodate both current and future data growth. Also, implement a robust redundancy and backup strategy. With critical manufacturing data, losing information due to hardware failure or network issues can be costly. Redundant storage, preferably in different geographic locations (for disaster recovery), is crucial for resilience.

TIP: For time-sensitive data (e.g., real-time process control), store at the edge and use data batching for non-urgent data that can be transmitted to the cloud periodically, reducing latency and network costs.

Analytics

Real-time analytics is essential for immediate decision-making (shutting down a faulty machine or adjusting a process parameter), while historical analytics provides long-term insights into trends and performance (predictive maintenance, yield optimization).

To enable real-time analytics, data should undergo initial pre-processing and filtering at the edge, so that only relevant insights or alerts are passed to the cloud or central system. This reduces data transfer overhead and minimizes latency in decision-making. For long-term analysis (identifying trends, root cause analysis), use batch processing techniques to handle large datasets over time. Machine learning (ML) and AI models are increasingly integrated into IIoT systems to identify anomalies, predict failures or optimize operations based on historical data.
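One common form of that edge pre-filtering is a rolling statistical check that forwards only readings deviating sharply from recent history. The sketch below uses a simple z-score over a sliding window; the window length, warm-up size and threshold are illustrative assumptions rather than tuned values.

```python
from collections import deque
from statistics import mean, pstdev


class EdgeAnomalyFilter:
    """Forward only readings that look anomalous relative to a recent window."""

    def __init__(self, window=120, z_threshold=3.0):
        self.history = deque(maxlen=window)
        self.z_threshold = z_threshold

    def should_forward(self, value: float) -> bool:
        anomalous = False
        if len(self.history) >= 10:  # wait for a minimal baseline
            mu, sigma = mean(self.history), pstdev(self.history)
            if sigma > 0 and abs(value - mu) / sigma > self.z_threshold:
                anomalous = True
        self.history.append(value)
        return anomalous


flt = EdgeAnomalyFilter()
stream = [72.0 + 0.1 * (i % 5) for i in range(50)] + [95.0]  # a sudden spike
alerts = [v for v in stream if flt.should_forward(v)]
print(alerts)  # only the spike is forwarded upstream
```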

IIoT analytics is more than just looking at individual sensor data; it’s about correlating data across multiple devices, sensors and even different factory lines to uncover patterns. Implement data fusion techniques where data from different sensors or sources can be combined to improve the accuracy and richness of insights.

Visualization

Visualization tools are essential for both operators and decision-makers to quickly assess the performance of processes and machines. These should include customizable dashboards that display real-time key performance indicators (KPIs) like throughput, efficiency, downtime and machine health. KPIs should be linked to the specific objectives of the manufacturing process.

For process optimization and long-term planning, historical trends and patterns should be visualized clearly. This allows for root-cause analysis, identifying inefficiencies and making data-driven decisions about process improvements.

These visualizations should be tailored to different user roles. Operators need real-time alerts and immediate insights into machine performance, while managers or engineers might need access to historical data and trend analysis. Design the user interface (UI) and access controls with these distinctions in mind.

For advanced implementations, digital twins and augmented reality can be used to simulate and visualize complex data in 3D. Digital twins create a virtual replica of the manufacturing environment, allowing engineers to monitor and optimize operations without needing to be physically present.

Planning IIoT implementations

When planning IIoT in manufacturing, focus on building a scalable, resilient and secure architecture for data collection, storage, analytics and visualization. Ensure that data collection is optimized to balance cost and data richness, using both edge and cloud storage appropriately. Analytics capabilities should provide real-time decision support while enabling deep insights through predictive maintenance and long-term performance analysis. Visualization tools should cater to different user needs, ensuring clear, actionable insights through both real-time dashboards and historical data views. Keep in mind the challenges of data volume, latency, network bandwidth and data integrity as you design the IIoT system, with attention to scalability and future-proofing the infrastructure to accommodate evolving manufacturing processes and technologies.

What are the roles of sensors and actuators in IIoT?
https://www.engineering.com/what-are-the-roles-of-sensors-and-actuators-in-iiot/ | Mon, 07 Oct 2024
Sensors are the eyes and ears of your operation and actuators are the hands.


Every manufacturing engineer considering an IIoT implementation should put considerable focus on how these systems contribute to data collection, real-time decision-making and automated control within the production environment.

Sensors are the eyes and ears of your operation. These data collection devices continuously monitor various physical or environmental parameters on the shop floor. Sensors have been developed to measure almost any condition on the shop floor. Here are some common types:

Temperature (for controlling furnaces or ovens)

Pressure (for monitoring hydraulic or pneumatic systems)

Vibration (for detecting imbalance in motors or machinery)

Humidity (for ensuring optimal conditions in certain manufacturing processes)

Proximity (for part detection on a conveyor belt or pallet)

Torque and Force (for ensuring precise assembly or machining)

These days, most sensors provide real-time data that are essential for understanding the status of machines, the health of equipment and the quality of products.

Sensors can capture data continuously or at regular intervals, feeding it back to a centralized system or edge devices. This data allows you to monitor machine performance and production quality in real-time. By continuously monitoring conditions such as temperature, vibration and pressure, sensors can help predict equipment failures before they happen—enabling predictive maintenance strategies. This minimizes downtime and unplanned repairs. Sensors can also ensure product quality by tracking parameters such as size, weight or chemical composition, ensuring products are within acceptable tolerances.

The data collected by sensors is sent to centralized cloud systems or edge devices for real-time analysis, enabling manufacturers to make informed decisions on production adjustments and process improvements.

Actuators: The Hands of Your IIoT System

Once sensors collect and transmit data, actuators play the critical role of executing actions based on the data received. Actuators are devices that respond to control signals by performing physical tasks, including:

Opening or closing a valve (to control fluid or gas flow in a pipeline)

Adjusting motor speeds (for conveyor belts or robotic arms)

Turning machines on or off (for automated start/stop of equipment)

Controlling temperature (by activating heating or cooling systems)

Moving robotic arms or equipment (for assembly, material handling or other precision tasks)

In an IIoT system, actuators are responsible for automating responses to specific conditions detected by sensors. This creates the foundation for closed-loop control systems that can operate independently of human intervention. For example, if a temperature sensor detects overheating, the actuator could activate a cooling system without manual intervention. This automation reduces human labor and the chances of errors or inefficiencies in production. It also speeds up response times to deviations, minimizing waste and downtime.

Actuators can also adjust machine settings dynamically. For example, based on real-time data, they can modify the speed or pressure of a machine, ensuring the production process adapts to the changing needs of the workflow.

In more advanced IIoT setups, edge computing and AI-driven algorithms use sensor data to make autonomous decisions, triggering actuators without human oversight. This could be as simple as adjusting a process or as complex as rerouting products based on real-time data streams.

Working together in IIoT

In a typical IIoT system, the interaction between sensors and actuators follows a continuous cycle of data collection and response, often referred to as closed-loop control. Here's an example, followed by a minimal code sketch of the same loop:

Sensors detect changes: A temperature sensor detects that the temperature in a furnace is rising above the set threshold.

Data is sent: The sensor transmits this information to the controller (either an edge device or cloud platform) in real-time.

Data is analyzed: The controller analyzes the data and determines that corrective action is needed (e.g., the furnace is overheating).

Actuator takes action: Based on the analysis, the controller sends a signal to an actuator that opens a valve to release cooling air or turns on a cooling system.

Process adjustment: The actuator performs the task, and the sensor continues to monitor the process, feeding back data to ensure the temperature returns to safe levels.
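The loop below is a minimal sketch of that cycle in code. The sensor read, actuator command, temperature limit and timing are all illustrative assumptions standing in for real device drivers or a PLC interface, not a production control implementation.

```python
import random
import time

TEMP_LIMIT_C = 850.0  # assumed safe threshold for this illustration


def read_furnace_temp_c() -> float:
    """Stand-in for a real sensor driver or fieldbus read."""
    return 840.0 + random.uniform(-5.0, 20.0)


def set_cooling_valve(open_valve: bool) -> None:
    """Stand-in for a real actuator command sent via a PLC or edge gateway."""
    print("cooling valve", "OPEN" if open_valve else "CLOSED")


def control_loop(cycles: int = 10, period_s: float = 1.0) -> None:
    for _ in range(cycles):
        temp = read_furnace_temp_c()       # steps 1-2: sensor detects and transmits
        overheating = temp > TEMP_LIMIT_C  # step 3: controller analyzes the data
        set_cooling_valve(overheating)     # step 4: actuator takes action
        time.sleep(period_s)               # step 5: process adjusts, cycle repeats


control_loop(cycles=3, period_s=0.5)
```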

Benefits of sensors and actuators in manufacturing

Increased Production Efficiency:

Sensors and actuators enable real-time adjustments to processes, ensuring that machines operate within optimal parameters. This minimizes downtime and keeps production flowing smoothly.

Enhanced Predictive Maintenance:

Continuous data from sensors allows for early detection of wear and tear or impending failures, reducing the need for reactive maintenance and minimizing unexpected breakdowns. Actuators can automatically adjust processes to prevent equipment damage.

Improved Quality Control:

Sensors track key quality metrics, and actuators can adjust the process instantly to ensure product quality remains consistent, reducing waste and scrap.

Operational Flexibility:

Sensors and actuators provide greater control over manufacturing systems, enabling them to respond flexibly to changes in production schedules, environmental factors, or even supply chain disruptions.

Cost Reduction:

Automation through sensors and actuators can lower labor costs and reduce human error. Moreover, optimized processes lead to less material waste, contributing to overall cost savings.

Data-Driven Decision Making:

By integrating sensors and actuators with a central data system (cloud or edge-based), manufacturers can leverage real-time analytics to gain actionable insights and make informed decisions to improve efficiency and productivity.

Common challenges

Let’s face it, maintaining a network of sensors and actuators and similar technology in a manufacturing environment can be tricky. Many environmental and workflow factors can result in degraded performance, even if they aren’t integrated into a broader IIoT implementation.

However, in IIoT manufacturing systems, several challenges are directly related to the integration of sensors and actuators into the broader industrial network. One key issue is communication latency and bandwidth limitations. IIoT systems rely heavily on real-time data transfer between sensors, actuators and control systems. Latency or insufficient bandwidth can delay data transmission or actuator responses, which is particularly troublesome in time-sensitive applications where quick reactions are essential.

Another challenge is connectivity and reliability issues. Since IIoT systems often involve wireless communication (e.g., Wi-Fi, LPWAN, or other IoT protocols), connectivity problems like signal dropouts, weak coverage or protocol incompatibility can disrupt the flow of critical data. In a networked environment, these disruptions can lead to missed sensor readings or commands not reaching actuators, causing downtime or unsafe conditions.

The sheer volume of data generated by IIoT devices can also lead to data overload and management challenges. With sensors constantly transmitting data, storage and processing systems can quickly become overwhelmed, making it difficult to extract actionable insights or react quickly to system needs. This can hinder operational efficiency, slow decision-making, and complicate data analysis.

Security vulnerabilities are another significant concern in IIoT systems. As sensors and actuators become more interconnected, they are exposed to potential cyber threats. Hackers could access the network to manipulate sensor data or control actuators, posing serious risks to both data integrity and physical safety.

Lastly, sensor and actuator compatibility can be an issue when integrating devices from different manufacturers or upgrading legacy systems. IIoT environments require seamless communication between different components, and incompatible sensors, actuators or communication protocols can lead to integration problems, system inefficiencies or even failures in real-time operations.

To address these challenges, best practices include using real-time networking protocols, implementing strong cybersecurity measures, employing edge computing to process data closer to the source, and ensuring that systems are compatible and interoperable across the IIoT network. These steps help ensure that the IIoT infrastructure operates reliably and efficiently.

What are the connectivity considerations in an IIoT implementation?
https://www.engineering.com/what-are-the-connectivity-considerations-in-an-iiot-implementation/ | Fri, 04 Oct 2024


Connectivity is the foundation of any Industrial Internet of Things (IIoT) implementation. For engineers, it’s not just about ensuring that devices and systems can talk to each other; it’s about choosing the right network architecture, protocols and security strategies to meet operational goals.

In IIoT, connectivity refers to the ability of machines, sensors and control systems to communicate over networks. This enables real-time data exchange and interaction between devices, local networks, edge systems and centralized cloud platforms. In IIoT implementations, this connectivity is critical to enabling the flow of data needed for process optimization, predictive maintenance, remote monitoring and real-time decision-making.

IIoT devices can range from sensors to actuators to industrial machines. For devices to exchange data directly, you’ll typically use machine-to-machine (M2M) protocols. Engineers must ensure that these devices can communicate over low-latency and robust protocols that handle the real-time data flows characteristic of industrial environments.

Protocols like Modbus, OPC UA and MQTT are the industry-standard cornerstones of device-to-device communication in IIoT, but there are many other protocols to choose from depending on the application, environment and system requirements. Each protocol comes with its own set of strengths and weaknesses, so it’s important to assess performance, security, scalability and interoperability when selecting a protocol for your IIoT architecture.

Another consideration is protocol overhead, which is the extra information that communication protocols add to manage data transmission, handle security, ensure data integrity and support real-time operation. While necessary for reliable, secure communication, overhead can reduce bandwidth efficiency, increase latency and consume more power, which is especially problematic in IIoT environments. Understanding and managing protocol overhead is essential for optimizing performance and efficiency in IIoT implementations.
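One concrete piece of that overhead is payload encoding. The quick comparison below contrasts a verbose JSON message with a compact binary packing of the same reading using Python's struct module; the field layout is an illustrative assumption, not a standard IIoT wire format.

```python
import json
import struct

# The same reading, encoded two ways
reading = {"sensor_id": 1207, "ts": 1728050400, "temp_c": 81.4}

as_json = json.dumps(reading).encode("utf-8")

# Fixed binary layout: unsigned int id, unsigned int timestamp, 32-bit float value
as_binary = struct.pack("!IIf", reading["sensor_id"], reading["ts"], reading["temp_c"])

print(len(as_json), "bytes as JSON vs", len(as_binary), "bytes packed")  # roughly 54 vs 12
```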

Edge connectivity

Edge devices (often called edge gateways or edge controllers) act as intermediaries between the industrial devices and the cloud. They handle preprocessing and data aggregation before sending relevant information upstream.

Implementing edge computing reduces latency, conserves bandwidth and allows for real-time decision-making at the device level. Edge architecture must be scalable and secure, often integrating with local databases or edge AI algorithms to run complex analytics.

Cloud connectivity and platform integration

IIoT relies heavily on cloud-based platforms for long-term data storage, aggregation, advanced analytics and remote monitoring. Cloud platforms offer scalable environments for handling data streams from devices in the field.

Ensuring reliable connectivity between edge nodes and the cloud is vital. Engineers should also focus on data integrity and network reliability, optimizing data protocols to reduce packet loss and latency.

Common protocols and data handling

MQTT is lightweight, supports real-time data and works well in low-bandwidth environments, making it ideal for IIoT where data volumes can be massive but not all data needs to be sent in real-time.

OPC UA is widely used in industrial settings for real-time data exchange between PLCs and other industrial automation equipment. It also supports security, which is a critical concern in industrial systems.

RESTful APIs or HTTP/HTTPS are more suitable for web-based interfaces or when integrating IIoT with existing enterprise IT systems but may not offer the real-time capabilities needed for certain mission-critical operations.
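As a minimal sketch of MQTT-style telemetry from an edge device, the snippet below uses the widely used third-party paho-mqtt package's one-shot publish helper. The broker address and topic hierarchy are assumptions for illustration; a production deployment would also pass TLS and authentication settings rather than publishing in the clear.

```python
import json

import paho.mqtt.publish as publish  # third-party package: pip install paho-mqtt

BROKER_HOST = "broker.example.local"        # assumed broker address
TOPIC = "plant1/line3/press07/temperature"  # assumed topic hierarchy

payload = json.dumps({"ts": 1728050400, "temp_c": 81.4})

# QoS 1 asks the broker to acknowledge delivery at least once
publish.single(TOPIC, payload=payload, qos=1, hostname=BROKER_HOST, port=1883)
```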

How to address connectivity challenges

Industrial environments can be challenging for connectivity due to electromagnetic interference, harsh environments and network congestion. Implement redundant networks (dual Ethernet, cellular backup) for failover in case of primary network failures. Mesh networking in IIoT can increase reliability in environments with intermittent connectivity.

Engineers will often deal with scaling from dozens to thousands of devices over a large geographical area. To support this, it’s important to architect networks that can grow without compromising performance. This may involve local edge computing to handle localized data aggregation and minimize bandwidth requirements.

Security is paramount in IIoT, especially when sensitive operational data and critical infrastructure are involved. Use end-to-end encryption (TLS, AES) and secure communication protocols (like OPC UA with security features enabled). Additionally, ensuring device authentication, role-based access control and network segmentation can help protect against cyber threats.

Zero-trust architectures are becoming increasingly popular in industrial networks to ensure that no device or user is implicitly trusted.

Latency and bandwidth optimization

Low latency is crucial for time-sensitive operations, such as real-time control or automated responses in manufacturing. For example, 5G and LPWAN (Low Power Wide Area Networks, such as LoRaWAN) are being explored for IIoT because they offer low latency, high bandwidth and long-range communication capabilities.

You should also look at how data is being transmitted. Use data compression, aggregation and edge processing to reduce the volume of data being sent over the network.

Technologies enhancing IIoT connectivity

With the advent of 5G, IIoT is gaining a huge advantage in terms of bandwidth and low latency. 5G allows for high-density device support and real-time communication, ideal for applications like autonomous vehicles, smart grids and advanced robotics in factories.

For environments where power efficiency is crucial and devices are spread across large areas, such as farms, pipelines or smart cities, LPWAN protocols offer extended range and low power consumption with relatively low bandwidth needs.

Edge computing reduces the need to send every bit of data to the cloud, providing a more efficient means of processing high volumes of data locally. This can include real-time anomaly detection or local decision-making that reduces latency and bandwidth needs.

Best practices for IIoT implementation

In industrial settings, systems and machines from multiple manufacturers may need to communicate with each other. Ensure your connectivity infrastructure allows for interoperability through open standards (like OPC UA) and modular architectures that can easily integrate with third-party equipment.

Track all data flows and network performance with network monitoring tools and data governance frameworks. This will help in troubleshooting, performance tuning and meeting compliance standards.

Architect your IIoT system in a modular way so new devices or protocols can be integrated without requiring a full system redesign. This modularity supports future-proofing the system as new technologies emerge.

For engineers implementing IIoT, connectivity is a multi-faceted challenge that involves choosing the right protocols, designing reliable and secure networks, optimizing for scalability and latency and ensuring devices can communicate efficiently across systems. The foundation for a successful IIoT implementation lies in robust, scalable and secure connectivity, enabling real-time data flow, remote monitoring and proactive decision-making.

What are the key aspects of IIoT?
https://www.engineering.com/what-are-the-key-aspects-of-iiot/ | Wed, 04 Sep 2024
The industrial Internet of Things is playing a pivotal role in shaping the future of manufacturing. Here we explore what it is and how it all started.


At its core, the industrial Internet of Things (IIoT) is about infusing traditional industrial environments with advanced digital technology. Sensors and smart devices are embedded in machinery to continuously collect data on everything from temperature to vibration. These sensors actively monitor and report on the performance and condition of equipment in real-time.

This data may seem like just a flood of numbers, but with the right mindset it’s a treasure trove of potentially actionable insights. In a well thought out implementation, advanced analytics and machine learning algorithms sift through this data, uncovering patterns and trends that were previously hidden. This means that rather than waiting for a machine to break down, manufacturers can now predict when a failure might occur and address it before any disruption. This proactive approach helps to reduce unexpected downtime and extend the lifespan of equipment.

The connectivity that IIoT brings means managers can adjust processes on the fly, optimize resource use and even automate many aspects of production. This level of automation boosts efficiency and enhances productivity, allowing for more streamlined operations and higher output.

Cost savings are another significant benefit of IIoT. By minimizing unplanned maintenance and optimizing energy consumption, manufacturers can reduce their operational expenses. Predictive maintenance, for example, ensures that equipment is serviced only when needed, rather than on a fixed schedule or after a failure.

Moreover, IIoT introduces a new level of flexibility into manufacturing. Factories equipped with IIoT technology can adapt more easily to changes in designs, demand or shifts in production requirements. Ideally, manufacturers can quickly reconfigure their operations or scale them up or down based on real-time needs, making them more responsive to market fluctuations.

Safety and regulatory compliance are also enhanced through IIoT. The continuous monitoring of equipment helps identify potential hazards before they become serious issues, creating a safer working environment. Additionally, accurate data collection supports compliance reporting with safety standards and regulations.

Additionally, wider supply chains benefit from the integration of IIoT. With better tracking and management capabilities, manufacturers can improve logistics decisions and inventory management, ensuring that materials and products move seamlessly through the supply chain.

In essence, IIoT has the potential to transform traditional manufacturing into a dynamic, data-driven environment. It’s turning factories into smart, connected ecosystems where every machine and process is in constant communication, leading to smarter decisions, greater efficiency, and a more agile and responsive production environment.

Evolution of IIoT: PLCs set the stage

The roots of IIoT can be traced back to the mid-20th century when electronic controls and automation began to take shape. The introduction of programmable logic controllers (PLCs) in the 1960s marked a significant leap forward, allowing machines to be controlled with greater precision and flexibility.

The 1980s and 1990s saw the integration of computer technology into industrial environments. The advent of personal computers and advancements in software led to the development of more sophisticated control systems. Manufacturing Execution Systems (MES) emerged, providing real-time data on production processes and improving operational efficiency. However, these systems were often isolated and lacked the connectivity seen in modern systems.

Enter IoT

The concept of the Internet of Things (IoT) began to take shape in the early 2000s, thanks to the proliferation of internet connectivity and sensor technology. Kevin Ashton coined the term “Internet of Things” in 1999 while working at Procter & Gamble, envisioning a future where everyday objects could communicate over the internet. This concept initially focused on consumer applications but laid the groundwork for what would become IIoT.

The early 2010s marked the formal emergence of IIoT as a distinct concept. As broadband internet and wireless technologies matured, machine builders began to integrate internet connectivity into industrial machinery and processes. The introduction of smart sensors, which collected and transmitted data about various operational parameters, was a game-changer. These sensors, coupled with advances in cloud computing and big data analytics, enabled real-time monitoring and analysis of industrial processes on an unprecedented scale.

Advancements and adoption

By the mid-2010s, IIoT had gained substantial traction across various sectors. The integration of advanced analytics and machine learning allowed for deeper insights and predictive capabilities. Industries from manufacturing to energy and transportation embraced IIoT to enhance efficiency, reduce downtime and optimize operations. The development of edge computing, which processes data closer to the source rather than relying solely on centralized cloud servers, further accelerated IIoT adoption by reducing latency and improving responsiveness.

Current state and future developments

Today, IIoT is a cornerstone of Industry 4.0, the fourth industrial revolution characterized by digital transformation and smart technologies. Modern IIoT systems leverage a combination of sophisticated sensors, advanced analytics and interoperable platforms to create highly efficient and adaptive industrial environments. Innovations such as digital twins—virtual replicas of physical systems—allow for simulation and optimization of industrial processes in real-time.

Looking ahead, the evolution of IIoT continues with advancements in AI and 5G connectivity, which promise even faster data transmission and more nuanced, automated analysis. As industries strive for greater automation, efficiency and sustainability, IIoT is expected to play an increasingly pivotal role in shaping the future of manufacturing and beyond.

To get value from AI, size matters—but not the way you think
https://www.engineering.com/to-get-value-from-ai-size-matters-but-not-the-way-you-think/ | Tue, 27 Aug 2024
Large companies can make huge gains from a solid AI implementation, but smaller firms can make an impact faster.

(Image: Allie Systems)

This scene plays out every day across American manufacturing: in a sprawling factory, the hum of machinery produces a metallic symphony that can only be created by the tools of advanced industrial progress. Yet, amid the whir of automated systems, state-of-the-art robots and small autonomous vehicles gliding across the floor to their intended destinations, a surprising sight unfolds: a worker stationed in the middle of the shop floor diligently recording real-time production data on a chalkboard.

This paradox—a high-tech factory still reliant on humanity’s oldest method of data recording—illustrates a broader issue within the manufacturing sector. Despite substantial investments in advanced software, machinery and automation, many factories lag in digitalization, missing out on the efficiencies that digital transformation and the latest AI technology are poised to deliver.

Alex Sandoval, founder and CEO of Allie Systems, a manufacturing AI developer based in Mexico, has witnessed this scenario numerous times. His company specializes in developing autonomous manufacturing AI agents and aims to redefine how enterprise-scale manufacturers interact with the latest technology. The mission is to bridge the gap between high-tech machinery and the outdated data practices still in use, even in many factories that would rightfully be considered smart manufacturing facilities.

Decoding manufacturing AI

Allie’s AI approach is to inject another dose of intelligence into that already smart manufacturing environment.

“Manufacturing involves numerous machines and computers generating vast amounts of data,” Sandoval explains. “Our role is to tap into that data, which is often underutilized, and turn it into actionable insights.”

Allie AI connects to the various machines within a production facility, gathering data on production quality, process variables and machine health. This data is then used to train an AI agent—essentially a highly specialized software system that becomes an expert in the specific operations of a factory. This AI agent can predict and identify issues, suggest improvements and even take action autonomously.

The challenge of digitalization

Despite the allure of advanced technology, the manufacturing sector faces significant hurdles. Sandoval highlights the irony of modern factories with cutting-edge robotics still relying on outdated methods of data collection. “I visited a top beverage manufacturer, and while their production line was fully automated, they had workers manually recording data,” he says. “It’s a stark reminder that automation does not automatically equal digitalization.”

This disconnect between sophisticated equipment and data management systems creates inefficiencies. The promise of AI in manufacturing lies in its ability to analyze vast amounts of data and provide real-time solutions, but many companies have yet to implement the type of cohesive digital strategy required to effectively train an AI agent.

Big versus small

Allie AI’s clients are predominantly large, enterprise-scale companies across the Americas. These are giants in industries like food and beverage, where the cost of inefficiency can be astronomical. Sandoval’s company has 35 full-time employees and an additional 40 implementation engineers who visit factories to physically install and connect the data gathering systems. This on-the-ground approach is crucial for understanding and optimizing complex manufacturing processes.

However, smaller companies also stand to benefit from AI. According to Sandoval, mid-sized firms have the advantage of agility, while at large enterprises potentially valuable projects can lie dormant for months. “While large companies can leverage more data, they often face bureaucratic delays,” he explains. “Smaller firms can move quickly, implementing solutions with fewer obstacles and adapting faster.”

Hype or reality

The promise of AI has faced its fair share of skepticism, with some arguing that it has yet to deliver significant value. Sandoval counters this view by pointing to the enormous potential for AI in manufacturing. “Globally, the manufacturing sector wastes trillions of dollars due to inefficiencies,” he notes. “AI can address these issues, but only if applied to the right problems.”

He highlights a real-world example from a cement company in Mexico, where downtime of a single oven can cost $550,000 per hour. By using AI to predict and prevent such issues, companies can achieve substantial cost savings and operational improvements.

It’s unlikely the benefits for smaller organizations will carry a similar dollar value. But the speed of implementation combined with a flatter organizational structure means the benefits, though smaller, will likely be realized much faster and with just as great an impact.

Looking ahead, Sandoval joins the chorus of voices that envision a future where AI becomes integral to manufacturing. “We’re moving towards a time where factories will be managed by AI agents that handle the bulk of operational decisions,” he says. “These agents will analyze data, suggest optimizations and even implement changes autonomously.” Sandoval also acknowledges the human element in this technological shift. The success of AI in manufacturing depends on upskilling the workforce and overcoming resistance to new technologies. “Training and adapting are crucial,” Sandoval emphasizes. “As AI evolves, so must our approach to managing and integrating it into our systems.”

Generative design for aerospace engineering
https://www.engineering.com/generative-design-for-aerospace-engineering/ | Thu, 01 Aug 2024
Exploring the benefits, challenges and future potential of generative design in aerospace.

A CAD model of the OPUSAT-II satellite after unfurling its deployment mechanism. (Image: NASA)

Generative design represents a transformative approach to engineering and manufacturing, particularly for aerospace manufacturers, which face stringent performance, weight and efficiency requirements.

This method leverages advanced computational algorithms to explore and generate optimized designs based on specified parameters and constraints. In aerospace, generative design offers significant advantages by producing innovative, lightweight and highly efficient parts tailored to meet specific needs such as aerodynamics, structural integrity and fuel efficiency.

Principles of generative design

Generative design harnesses the power of algorithms and computational simulations to explore a vast range of potential design configurations at a speed no human would be capable of matching.

Unlike traditional design methods that rely heavily on human intuition and experience, generative design starts with defining design goals, constraints and performance criteria. These could include factors such as load requirements, material properties, manufacturing constraints, and operational conditions.

Generative design algorithms explore numerous design iterations automatically, considering various combinations of shapes, materials and structural arrangements. Many fundamental algorithms and theoretical frameworks behind generative design are developed in universities and research institutions where academic researchers focus on the mathematical and computational aspects of design algorithms. Companies specializing in design and engineering software also create and implement generative design algorithms within their tools. These companies employ teams of engineers and data scientists who develop and refine the algorithms for practical applications based on advances in computational technology and design theory.

Generative design often integrates with advanced simulation tools such as finite element analysis (FEA) and computational fluid dynamics (CFD) to evaluate and validate the performance of generated designs under real-world conditions. Another common technique in generative design is topology optimization, which minimizes material usage while maximizing structural performance by iteratively removing material from less critical areas and reinforcing areas subject to higher stress loads.
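As a highly simplified sketch of the remove-material loop described above, the code below repeatedly discards the least-stressed fraction of elements until a target volume fraction is reached. The stress values are a random stand-in for a real FEA solve, so this illustrates only the iterative structure of topology optimization, not a usable optimizer.

```python
import random

random.seed(0)


def placeholder_stress(element_id: int) -> float:
    """Stand-in for an FEA result; real tools recompute stresses every iteration."""
    return random.random()


def toy_topology_optimization(n_elements=1000, target_volume_fraction=0.4,
                              removal_rate=0.05):
    """Iteratively remove the least critical elements until the volume target is met."""
    kept = set(range(n_elements))
    while len(kept) / n_elements > target_volume_fraction:
        stresses = {e: placeholder_stress(e) for e in kept}  # evaluate current design
        n_remove = max(1, int(removal_rate * len(kept)))     # drop a small batch per pass
        for element, _ in sorted(stresses.items(), key=lambda kv: kv[1])[:n_remove]:
            kept.remove(element)                             # remove low-stress material
    return kept


design = toy_topology_optimization()
print(f"kept {len(design)} of 1000 elements")
```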

By exploring unconventional shapes and configurations that might not be intuitively obvious to human designers, generative design can uncover solutions that optimize weight, strength and performance.

Benefits of generative design in aerospace

Generative design offers several compelling benefits critical to aerospace applications. Here are a few:

Weight Reduction and Optimization: Generative design can produce lightweight structures by removing excess material while maintaining or enhancing structural integrity.

Performance Enhancement: Aerospace components designed using generative techniques can be optimized for specific performance criteria such as aerodynamics, thermal management, and acoustic properties.

Complex Geometry Handling: Aerospace parts often require complex geometries to meet functional and aerodynamic requirements. Generative design excels in handling such complexities by creating organic and efficient shapes that traditional manufacturing methods may find challenging to produce.

Manufacturability and Cost Efficiency: While generative designs can be highly complex, advancements in additive manufacturing (AM) technologies facilitate the production of intricate geometries with minimal waste. This enhances manufacturability and reduces material costs compared with traditional machining methods.

Innovation and Rapid Iteration: Generative design fosters innovation by allowing engineers to explore a broader design space and quickly iterate through potential solutions. This agility is crucial in adapting to evolving aerospace requirements and technological advancements.

Challenges and considerations

Despite its transformative potential, generative design in aerospace faces several challenges. The computational resources required can be substantial, particularly for complex aerospace components. High-performance computing infrastructure and efficient algorithms are essential to manage this complexity effectively, and neither comes cheap. Nor is the demand limited to the design phase: validating generatively designed parts for performance under various operating conditions is crucial, and it is also resource-intensive.

Even with the required computational horsepower, integrating generative design into existing engineering workflows and CAD systems isn’t easy. Ensuring compatibility, data interoperability and a seamless transition from design to manufacturing are critical considerations, and many software companies are developing solutions to these compatibility challenges.

Optimal designs generated by algorithms may call for material properties that conventional alloys and composites do not readily offer. Identifying suitable materials and qualifying them for aerospace use are ongoing challenges, and exotic materials are always more expensive.

While generative design automates much of the design exploration process, human expertise remains essential for interpreting results, refining designs based on engineering judgment and ensuring the feasibility of manufacturing and operational requirements. This requires highly skilled engineers with a strong understanding of how generative design works and how to use it properly.

Future implications of generative design

The future of generative design in aerospace holds some exciting developments. Integrating artificial intelligence (AI) and machine learning (ML) with generative design could enable systems to learn from past designs, predict performance outcomes and suggest novel solutions that continuously improve over time without human intervention.

As computing power and simulation capabilities continue to advance, generative design will expand its scope to explore more complex and multidisciplinary design spaces. Connecting generative design with digital twin technologies will enable real-time monitoring and optimization of aerospace components throughout their lifecycle, from design and manufacturing to operation and maintenance. All of this data can be shared via cloud-based platforms and collaboration tools to facilitate global teamwork and knowledge sharing among engineers, accelerating innovation cycles and enabling faster deployment of optimized designs.
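
As a hedged illustration of that kind of AI integration, the sketch below trains a surrogate model on past design evaluations so that new candidates can be screened cheaply before committing to full simulation. The data is synthetic and the model choice (a random forest from scikit-learn) is just one plausible option, not a description of any vendor's implementation.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Synthetic "past designs": columns are design parameters
# (e.g., rib thickness, spar spacing, skin gauge) -- all invented.
X = rng.uniform(low=[1.0, 50.0, 0.8], high=[5.0, 200.0, 3.0], size=(500, 3))

# Synthetic "simulation results": peak stress in MPa as a nonlinear
# function of the parameters plus noise, standing in for FEA output.
y = 400.0 / X[:, 0] + 0.5 * X[:, 1] - 30.0 * X[:, 2] + rng.normal(0, 5, 500)

# Train a surrogate that learns the parameter-to-performance mapping.
surrogate = RandomForestRegressor(n_estimators=200, random_state=0)
surrogate.fit(X, y)

# Screen new candidate designs in milliseconds instead of hours of simulation.
candidates = rng.uniform(low=[1.0, 50.0, 0.8], high=[5.0, 200.0, 3.0], size=(10, 3))
predicted_stress = surrogate.predict(candidates)
for params, stress in zip(candidates, predicted_stress):
    print(f"params {np.round(params, 2)} -> predicted peak stress {stress:.1f} MPa")
```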

The post Generative design for aerospace engineering appeared first on Engineering.com.

]]>
The 5 layers of digital transformation https://www.engineering.com/the-5-layers-of-digital-transformation/ Fri, 19 Jul 2024 16:58:33 +0000 https://www.engineering.com/?p=52440 How to think about digital integration and transformation within a company or process.

The post The 5 layers of digital transformation appeared first on Engineering.com.

]]>

Embarking on digital transformation for an aerospace manufacturing company signifies a strategic shift towards integrating advanced digital technologies across all facets of operations.

This includes using technologies such as the Industrial Internet of Things (IIoT) for real-time monitoring of equipment and systems, implementing artificial intelligence (AI) and machine learning algorithms for predictive maintenance and optimized production scheduling, and adopting digital twins to simulate and optimize the performance of aircraft components and systems.

The digitalization pyramid

The digitalization pyramid is a conceptual framework used in industrial and organizational contexts to illustrate the levels of digital integration and transformation within a company or process.

It consists of several layers or stages, each representing different aspects of digitalization. While variations exist, a common representation includes the following layers:

Data collection: The base layer of the pyramid involves the collection of raw data from various sources within the organization or across the value chain. This data can come from sensors, machines, devices, databases or virtually any other connected system.

Data integration: The next layer is about integrating and consolidating the collected data into a unified format or system. This stage ensures that data from different sources can be accessed, processed and analyzed.

Data analysis: You guessed it. This layer is about analyzing the integrated data to derive insights, trends, patterns and actionable information. Techniques such as statistical analysis, machine learning and artificial intelligence are a natural fit here.

Digitalization: This layer involves the transformation of business processes and operations using digital technologies and insights gained from data analysis. It includes automation, optimization and the use of digital tools to streamline workflows and improve efficiency.

Digital transformation: This last phase is the goal of the entire exercise and represents the strategic adoption of digital technologies to fundamentally change how a business operates, delivers value to customers and competes in the market. It may involve new business models, innovative products or services and a shift towards a more data-driven and agile organization.

This is a basic roadmap for organizations looking to evolve and harness the power of digital technologies, but nothing about this process is basic. Each one of these phases is made up of many complicated initiatives and no company can do this properly without good partners in the process.
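
To make the lower layers of the pyramid concrete, here is a minimal, hypothetical sketch of collection, integration and analysis for machine data. The source names, fields and alert threshold are invented for illustration.

```python
from dataclasses import dataclass
from statistics import mean

# --- Data collection: raw readings as they might arrive from two sources ---
plc_readings = [
    {"machine": "press_01", "ts": 1, "spindle_temp_c": 61.2},
    {"machine": "press_01", "ts": 2, "spindle_temp_c": 88.4},
]
mes_records = [
    {"machine": "press_01", "ts": 1, "part_count": 14},
    {"machine": "press_01", "ts": 2, "part_count": 15},
]

# --- Data integration: consolidate the sources into one unified record ---
@dataclass
class UnifiedRecord:
    machine: str
    ts: int
    spindle_temp_c: float
    part_count: int

integrated = []
for plc in plc_readings:
    match = next(r for r in mes_records
                 if r["machine"] == plc["machine"] and r["ts"] == plc["ts"])
    integrated.append(UnifiedRecord(plc["machine"], plc["ts"],
                                    plc["spindle_temp_c"], match["part_count"]))

# --- Data analysis: derive a simple insight from the integrated data ---
TEMP_LIMIT_C = 80.0  # invented alert threshold
avg_temp = mean(r.spindle_temp_c for r in integrated)
alerts = [r for r in integrated if r.spindle_temp_c > TEMP_LIMIT_C]
print(f"average spindle temperature: {avg_temp:.1f} C")
for r in alerts:
    print(f"ALERT: {r.machine} exceeded {TEMP_LIMIT_C} C at ts={r.ts}")
```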

What’s the difference between digitization and digitalization?

The terms “digitization” and “digitalization” are related but have distinct meanings in the context of technology and business transformation:

Digitization refers to the process of converting information or data from analog to digital form. It involves transforming physical or analog artifacts (such as documents, images, videos or processes) into digital formats that can be processed, stored and transmitted electronically. Examples include scanning paper documents to create digital copies, converting analog audio or video recordings into digital formats or creating digital records of interactions between machines.

Digitalization is the broader process of integrating digital technologies into various aspects of business operations, processes and strategies to fundamentally change how they operate and deliver value to customers. It relies on digital technologies (like AI, IoT, cloud computing, data analytics) to improve efficiency, create new business models, enhance customer experiences and innovate within an organization. Some examples would be implementing IoT sensors to gather real-time data for predictive maintenance, using AI algorithms to automate decision-making processes, adopting cloud-based solutions for scalable operations or redesigning customer interactions through digital channels.

The post The 5 layers of digital transformation appeared first on Engineering.com.

]]>
Additive manufacturing’s role in digital manufacturing for aerospace https://www.engineering.com/additive-manufacturings-role-in-digital-manufacturing-for-aerospace/ Tue, 16 Jul 2024 21:22:55 +0000 https://www.engineering.com/?p=52401 Discussing the basics of how this production-scale technology is used to manufacture aerospace parts out of both plastic and metal.

The post Additive manufacturing’s role in digital manufacturing for aerospace appeared first on Engineering.com.

]]>
The inner workings of a 3D printer. (Image: Mantle 3D)

Additive manufacturing (AM), often referred to as 3D printing, has developed into a transformative technology in digital manufacturing and is especially applicable in the aerospace sector. Gone are the days when this technology was reserved for producing physical models during the prototyping phase of product development. It has now become a production-scale technology used to manufacture parts out of both plastic and metal.

Complex design

What makes additive manufacturing such an important piece of the digital manufacturing puzzle is its ability to produce highly complex geometries, including assemblies with moving parts, that are impossible to achieve using traditional technologies such as machining, joining and forming.

This is a perfect fit for the aerospace sector, where components often benefit from intricate internal structures, lightweight designs and optimized shapes that enhance performance and efficiency. This “lightweighting” is crucial in aerospace to improve fuel efficiency and increase payload capacity.

AM allows engineers to design and fabricate parts with optimized geometries and material distributions, reducing overall weight while maintaining or even improving structural integrity and performance. It supports a wide range of materials, including metals, polymers, composites and ceramics, which aerospace manufacturers select for specific properties, such as high strength-to-weight ratios, heat resistance and conductivity, tailored to the requirements of each component.
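
The sketch below illustrates the strength-to-weight trade-off that drives much of this material selection, using nominal, order-of-magnitude property values; any real selection must rely on qualified datasheet values for the specific AM process, build orientation and heat treatment.

```python
# Nominal, order-of-magnitude material properties for illustration only;
# real selections must use qualified datasheet values for the specific
# AM process, build orientation and heat treatment.
materials = {
    "Al 7075 (machined baseline)":     {"density_kg_m3": 2810, "strength_mpa": 500},
    "Ti-6Al-4V (powder bed fusion)":   {"density_kg_m3": 4430, "strength_mpa": 880},
    "PEEK (polymer AM)":               {"density_kg_m3": 1320, "strength_mpa": 100},
    "Inconel 718 (powder bed fusion)": {"density_kg_m3": 8190, "strength_mpa": 1030},
}

# Rank by specific strength (strength-to-weight), a key driver in aerospace.
ranked = sorted(materials.items(),
                key=lambda kv: kv[1]["strength_mpa"] / kv[1]["density_kg_m3"],
                reverse=True)
for name, props in ranked:
    specific = props["strength_mpa"] * 1e6 / props["density_kg_m3"]  # J/kg
    print(f"{name:34s} specific strength ~ {specific/1000:.0f} kJ/kg")
```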

Integration with smart manufacturing

Unlike its traditional manufacturing counterparts, which are built on fundamentals developed more than 200 years ago, additive manufacturing is a relatively recent technology that matured in tandem with the digital technology boom of the past two decades. Because of this, it integrates nicely with technologies that make up Industry 4.0 and smart manufacturing. This fosters a more interconnected, efficient and data-driven manufacturing environment.

AM supports agile manufacturing by enabling on-demand production of customized parts and reducing lead times compared to traditional manufacturing methods. Smart manufacturing systems can dynamically adjust production schedules and allocate resources based on real-time demand signals and production data.

Additive also minimizes waste by using only the material required for each part, contributing to sustainability goals through better energy usage and operational efficiency while reducing environmental impact and operating costs.

AM reduces dependency on traditional supply chains by enabling on-demand production of parts and components. It supports customization and rapid prototyping, allowing for iterative design improvements and faster development cycles.

Repair and maintenance

AM is increasingly used for repair and maintenance operations in aerospace because it allows for the manufacturing of replacement parts on-site or on-demand, reducing downtime and logistics costs associated with spare parts inventory.

Although it has taken a long time, additive manufacturing now represents the paradigm shift that industry was always told it could be. It offers unprecedented design flexibility, efficiency gains and performance enhancements. As the technology continues to advance, additive manufacturing is poised to play an even more significant role in shaping the future of aerospace production and innovation.

Key terminology for additive manufacturing

STL file: A file format used in additive manufacturing to represent surface geometry of a 3D object. It stands for Stereolithography.

Slicing: The process of dividing a digital 3D model into thin horizontal layers for the additive manufacturing process (a minimal sketch of this step follows the glossary).

Build plate: The surface upon which objects are 3D printed or built during additive manufacturing.

Fused deposition modeling (FDM): An additive manufacturing technology where a thermoplastic filament is extruded layer by layer to build objects.

Stereolithography (SLA): An additive manufacturing process that uses a UV laser to solidify layers of liquid resin to build 3D objects.

Powder bed fusion: A category of additive manufacturing technologies where layers of powdered material are selectively fused together using a heat source (e.g., laser or electron beam).

Support structures: Temporary structures added to 3D models during printing to support overhanging features and ensure successful printing.

Build volume: The maximum size of an object that can be printed within a particular 3D printer.

Post-processing: Additional steps taken after printing to improve the appearance or functionality of the printed object (e.g., cleaning, curing, polishing).

Selective laser sintering (SLS): An additive manufacturing process that uses a high-powered laser to selectively fuse powdered materials (such as plastics, metals, or ceramics) into a solid 3D structure.

Electron beam melting (EBM): An additive manufacturing process where a high-energy electron beam selectively melts and fuses metal powder particles to build up a 3D object layer by layer.

Direct metal laser sintering (DMLS): A variation of SLS specifically for metal powders, where a laser fuses metal powder particles together to create a metal part.

Binder jetting: An additive manufacturing process where a liquid binding agent is selectively deposited onto powder bed layers to bind them together, forming a solid object.

Material extrusion: A category of additive manufacturing processes where material is selectively deposited through a nozzle or orifice. Includes FDM and similar technologies.

Infill: The internal structure of a 3D printed object, which can vary in density and pattern to achieve different mechanical properties and reduce material usage.
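
To illustrate the slicing entry above, here is a minimal sketch that divides a toy part into layers and checks which triangles of a small mesh cross each layer plane. Production slicers also generate contours, infill and support structures, which this deliberately ignores.

```python
# Minimal illustration of slicing: divide a part into horizontal layers and
# find which triangles of a (toy) mesh intersect each layer plane.

LAYER_HEIGHT = 0.2   # mm, a common FDM layer height
PART_HEIGHT = 1.0    # mm, toy part kept small for brevity

# A toy "mesh": each triangle is three (x, y, z) vertices in mm.
triangles = [
    [(0, 0, 0.0), (10, 0, 0.0), (0, 10, 1.0)],
    [(0, 0, 0.4), (10, 0, 0.8), (10, 10, 1.0)],
]

num_layers = round(PART_HEIGHT / LAYER_HEIGHT)
for i in range(num_layers):
    z = (i + 0.5) * LAYER_HEIGHT  # slice at the middle of each layer
    crossing = [
        idx for idx, tri in enumerate(triangles)
        if min(v[2] for v in tri) <= z <= max(v[2] for v in tri)
    ]
    print(f"layer {i:2d} at z={z:.2f} mm crosses triangles {crossing}")
```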

The post Additive manufacturing’s role in digital manufacturing for aerospace appeared first on Engineering.com.

]]>
Advanced design tools for digital manufacturing in aerospace https://www.engineering.com/advanced-design-tools-for-digital-manufacturing-in-aerospace/ Mon, 15 Jul 2024 18:51:00 +0000 https://www.engineering.com/?p=52299 CAD and CAE software are two advanced design tools that have become indispensable in aerospace manufacturing.

The post Advanced design tools for digital manufacturing in aerospace appeared first on Engineering.com.

]]>
A Lockheed Martin technician looks at the connector installation on the CAD model of the X-59 airplane. (Image: NASA)

In aerospace manufacturing, several advanced design tools play crucial roles in ensuring efficiency, safety, and performance.

Computer-aided design (CAD) software has played a central role in aerospace manufacturing for many years. It has evolved to enable more than just design, expanding to support simulation and optimization of components and systems.

CAD software allows engineers to create detailed 2D drawings and 3D models of aircraft components, such as wings, fuselage, landing gear and interiors. Engineers can visualize designs from different angles, zoom in for detailed views and rotate components to examine them thoroughly for performance, weight reduction and manufacturability.

Parametric modeling in CAD enables engineers to define the dimensions, relationships, and constraints of each part within a design. Changes made to one part automatically update related components and assemblies, ensuring consistency and reducing errors.
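
As a rough analogy for that parametric behavior (not a description of how any CAD kernel is implemented), the sketch below derives dependent dimensions from driving parameters so that a single change propagates automatically.

```python
from dataclasses import dataclass

@dataclass
class WingSparParams:
    """Driving parameters; everything else is derived from these."""
    span_mm: float
    rib_count: int
    flange_ratio: float = 0.15  # flange width as a fraction of rib pitch

    @property
    def rib_pitch_mm(self) -> float:
        # Derived dimension: changing span or rib_count updates this automatically.
        return self.span_mm / (self.rib_count - 1)

    @property
    def flange_width_mm(self) -> float:
        # A second derived dimension, chained off the first.
        return self.flange_ratio * self.rib_pitch_mm

spar = WingSparParams(span_mm=12_000, rib_count=25)
print(f"rib pitch: {spar.rib_pitch_mm:.1f} mm, flange: {spar.flange_width_mm:.1f} mm")

# One change to a driving parameter propagates to all dependent dimensions,
# which is the essence of parametric, constraint-driven modeling.
spar.rib_count = 31
print(f"rib pitch: {spar.rib_pitch_mm:.1f} mm, flange: {spar.flange_width_mm:.1f} mm")
```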

CAD software also facilitates the assembly of multiple components into a complete assembly and even a full aircraft or spacecraft. Engineers simulate the fitting of parts, check for interferences and ensure that all components align correctly within the assembly.

CAD systems facilitate collaboration among multidisciplinary teams, allowing engineers, designers and analysts to work together on a single digital platform. The software generates detailed documentation, including engineering drawings, bill of materials (BOM) and manufacturing instructions.

Models developed in CAD software serve as a basis for computer-aided manufacturing (CAM) processes, where they are used to generate toolpaths for machining, additive manufacturing or composite layup. Simulating those toolpaths helps ensure the machinability of manufactured parts.

CAD software also integrates with product lifecycle management (PLM) systems to manage the entire lifecycle of aerospace products, from initial concept through design, manufacturing, operations, maintenance and eventual retirement.

CAD software is indispensable in aerospace manufacturing for its ability to streamline design processes, improve accuracy, optimize performance, and support collaborative efforts across the entire product lifecycle. Its integration with other advanced tools and technologies further enhances its utility in creating safe, efficient and innovative aerospace solutions.

Computer-aided engineering

Where CAD is primarily focused on creating detailed 2D and 3D models of components and assemblies, computer-aided engineering (CAE) software is used for simulation, analysis and optimization of designs to evaluate their performance under various conditions such as stress, heat, fluid flow and vibration. CAE software helps engineers predict how products will behave in real-world scenarios, aiding in design validation and performance enhancement. Here are the key details of how CAE software is used in this industry:

Finite element analysis

CAE software allows engineers to simulate how aircraft structures will behave under different loading conditions, such as aerodynamic forces, landing impacts and vibrations during flight. Engineers define material properties, boundary conditions and loads in the software. Finite element analysis (FEA) calculates stresses, strains, deformations and factors of safety to ensure that structural components meet safety and performance requirements. It helps optimize designs by identifying areas of high stress or deformation that may need reinforcement or redesign.
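
To show the kind of calculation FEA automates at enormous scale, here is a minimal one-dimensional sketch: an axially loaded bar discretized into elements, with a stiffness matrix assembled and solved for nodal displacements and stresses. The material, load and allowable values are placeholders; real aerospace models involve millions of degrees of freedom and far richer element formulations.

```python
import numpy as np

# Minimal 1D finite element sketch: a bar fixed at one end, pulled at the other.
# Illustrative values only.
E = 70e9          # Pa, Young's modulus (aluminum-like)
A = 3e-4          # m^2, cross-sectional area
L = 2.0           # m, total bar length
P = 10_000.0      # N, axial tip load
n_elems = 10
le = L / n_elems  # element length

# Assemble the global stiffness matrix from identical 2-node bar elements.
n_nodes = n_elems + 1
K = np.zeros((n_nodes, n_nodes))
k_e = (E * A / le) * np.array([[1.0, -1.0], [-1.0, 1.0]])
for e in range(n_elems):
    K[e:e + 2, e:e + 2] += k_e

# Load vector: point load at the free end.
F = np.zeros(n_nodes)
F[-1] = P

# Apply the fixed boundary condition at node 0 by removing its row/column.
u = np.zeros(n_nodes)
u[1:] = np.linalg.solve(K[1:, 1:], F[1:])

# Element stress = E * strain; compare the peak against an assumed allowable.
stresses = E * np.diff(u) / le
allowable = 250e6  # Pa, assumed allowable stress
print(f"tip displacement: {u[-1]*1e3:.3f} mm")
print(f"peak stress: {stresses.max()/1e6:.1f} MPa, "
      f"factor of safety: {allowable/stresses.max():.1f}")
```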

Aerodynamics

Computational fluid dynamics (CFD) simulates the flow of air around aircraft surfaces and through internal systems. CFD predicts aerodynamic forces, such as lift and drag, which are crucial for optimizing aircraft performance and fuel efficiency. It allows engineers to evaluate different wing designs, control surface configurations and engine placements to achieve desired flight characteristics.
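
Full CFD is far too heavy to sketch here, but the quantities it predicts feed simple trade studies like the one below, which computes lift, drag and lift-to-drag ratio from assumed coefficients at an illustrative cruise condition. None of the numbers describe a real aircraft.

```python
# Illustrative cruise condition; all values are placeholders.
rho = 0.38        # kg/m^3, approximate air density near 11 km altitude
V = 230.0         # m/s, true airspeed
S = 125.0         # m^2, wing reference area
CL = 0.52         # lift coefficient (assumed, e.g. from CFD or wind tunnel data)
CD = 0.028        # drag coefficient (assumed)

q = 0.5 * rho * V ** 2        # dynamic pressure, Pa
lift = q * S * CL             # N
drag = q * S * CD             # N

print(f"dynamic pressure: {q/1000:.1f} kPa")
print(f"lift: {lift/1000:.0f} kN, drag: {drag/1000:.1f} kN, L/D: {lift/drag:.1f}")
```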

Thermal analysis

CAE tools simulate thermal behavior within aircraft components and systems. They predict temperatures across surfaces and within structures under varying environmental conditions, such as high-altitude flight or engine operation. Thermal analysis ensures that components remain within safe operating temperatures and that heat is managed to prevent thermal stresses and failures.
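
As a hedged, one-dimensional illustration of the checks thermal analysis performs, the sketch below computes steady-state conduction through a two-layer wall and compares the resulting metal temperature against an assumed limit. The layer properties and temperatures are placeholders.

```python
# 1D steady-state conduction through a two-layer wall (e.g., a thermal barrier
# coating over a metal casing). All values are illustrative placeholders.
T_hot = 1100.0      # C, gas-side surface temperature
T_cold = 300.0      # C, structure-side surface temperature

# Layer thickness (m) and thermal conductivity (W/m-K), coating then metal.
layers = [
    ("ceramic coating", 0.0005, 1.5),
    ("metal casing",    0.0030, 20.0),
]

# Series thermal resistance per unit area: R = sum(thickness / conductivity).
R_total = sum(t / k for _, t, k in layers)
q = (T_hot - T_cold) / R_total   # heat flux, W/m^2

# March through the layers to find each interface temperature.
T = T_hot
interface_temps = []
for name, t, k in layers:
    T -= q * (t / k)
    interface_temps.append(T)
    print(f"temperature after {name}: {T:.0f} C")

metal_limit_c = 650.0  # assumed allowable metal surface temperature
print(f"heat flux: {q/1000:.0f} kW/m^2, "
      f"metal surface within limit: {interface_temps[0] <= metal_limit_c}")
```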

Multi-physics simulations

Some CAE software platforms integrate multiple physics simulations, such as coupling structural, thermal, and fluid dynamics analyses. This capability is essential for evaluating complex interactions between different physical phenomena within aerospace systems, such as heat transfer in engine components or structural deformation under fluid pressure.

Vibration and acoustics

CAE software models and predicts the vibration and acoustic behavior of aircraft structures and cabin interiors. Engineers use these simulations to reduce noise levels, ensure passenger comfort and prevent structural fatigue caused by resonance or excessive vibration.
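
As a toy version of that resonance check, the sketch below estimates the natural frequency of a single-degree-of-freedom bracket and compares it with an assumed excitation frequency; real analyses use full modal models of the structure, and all values here are placeholders.

```python
import math

# Single-degree-of-freedom natural frequency check; values are placeholders.
stiffness = 2.0e6      # N/m, equivalent stiffness of a mounting bracket
mass = 3.5             # kg, supported equipment mass

f_natural = math.sqrt(stiffness / mass) / (2.0 * math.pi)   # Hz

# Assumed dominant excitation, e.g. an engine-order frequency at cruise.
f_excitation = 85.0    # Hz
separation = abs(f_natural - f_excitation) / f_excitation

print(f"natural frequency: {f_natural:.1f} Hz")
print(f"excitation: {f_excitation:.1f} Hz, separation margin: {separation:.0%}")
if separation < 0.15:
    print("WARNING: within 15% of excitation; risk of resonance, redesign needed")
```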

Impact and crashworthiness

CAE tools simulate impact scenarios, such as bird strikes or emergency landings, to assess crashworthiness and occupant safety. They predict the behavior of aircraft structures and materials under sudden loads to optimize designs for survivability and structural integrity.

Optimization and design iteration

CAE software facilitates design optimization by analyzing multiple design iterations quickly and efficiently. Engineers can evaluate different materials, geometries, and configurations to achieve optimal performance, weight savings, and cost-effectiveness.

Certification and compliance

CAE simulations contribute to the certification process by providing data and analyses that demonstrate compliance with regulatory requirements and safety standards. They help validate designs before physical testing, reducing time and costs associated with certification efforts.

CAE software enables aerospace engineers to simulate, analyze, and optimize complex systems and components throughout the design and development process. By providing insights into structural integrity, aerodynamic performance, thermal behavior, and more, CAE tools support the creation of safe, efficient, and innovative aerospace solutions. Structural analysis, computational fluid dynamics and thermal analysis are essential in aerospace to ensure components meet performance requirements and safety standards.

The post Advanced design tools for digital manufacturing in aerospace appeared first on Engineering.com.

]]>