The Industry 4.0 framework is driving a revolution in manufacturing processes across many strategic industrial sectors, such as aerospace, automotive and energy, offering advantages in terms of strong product customization, increased component intelligence and rapid data transfer. Industry 4.0 technologies can provide new momentum in the design and operation of increasingly efficient and reliable combustion systems. The burner prototypes will result from traditional design criteria integrated with digital Industry 4.0 technologies: Additive Manufacturing, Integrated Sensors and IoT devices for Edge Computing, Simulation, Data Analytics and Augmented Reality. A Digital Twin of the physical Burner 4.0 will be the bridge between the Industry 4.0 technologies and current burner technology.
[Diagram: the Digital Twin at the centre of the Industry 4.0 technologies — IoT, Simulation, Additive Manufacturing, Integrated Sensors, Data Analytics, Augmented Reality, Edge Computing]
Additive Manufacturing
Additive Manufacturing (AM) of metallic parts makes it possible to directly fabricate net or near-net-shape components without the need for any tooling or machining operations.
Several AM systems for producing metal parts, based on melting the feedstock by means of a laser, electron beam or plasma source, are available on the market today.
Integrated sensors for combustion processes
Advanced, robust, plant-embeddable sensors for the real-time diagnostics of combustion processes are key elements in the data collection chain required by innovative burner control and optimization strategies aimed at increasing efficiency while minimizing pollutant emissions. Accordingly, they are also essential for the development and operation of the burner Digital Twin, as well as for the integration of the IoT concept, in new and existing plants.
Different families of sensors useful for combustion processes can be identified according to their underlying physical principle. These families have different, and largely complementary, diagnostic potentials in terms of measurable quantities, sensitivity, accuracy and robustness, and all are worth considering.
The synergistic coupling of optical diagnostics with complementary techniques based on chemical sampling and analysis is often considered for the experimental characterization of simple combustion processes. The two approaches are complementary in terms of the species detected in the control volume: optical diagnostics allow the detection of radical species, whereas chemical sampling enables the measurement of stable species, making it possible to characterize the combustion progress at both local and integral scales. This is particularly relevant for advanced combustion technologies, such as flameless combustion. The simultaneous detection of the amplitude and spectral content of acoustic emissions can be used to gain better insight into the process and to design more effective control strategies.
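As an illustrative sketch of the kind of processing this implies, the following Python fragment extracts an amplitude descriptor (RMS level) and a spectral descriptor (dominant frequency) from a digitized acoustic emission trace; the signal, sampling rate and feature choice are assumptions for illustration, not project specifications.

    import numpy as np

    def acoustic_features(signal: np.ndarray, fs: float) -> dict:
        """Extract amplitude and spectral features from an acoustic trace.

        signal: digitized microphone/pressure signal (1-D array)
        fs:     sampling frequency in Hz
        """
        # Amplitude descriptor: root-mean-square level of the trace
        rms = float(np.sqrt(np.mean(signal ** 2)))

        # Spectral descriptor: dominant frequency of the windowed spectrum
        spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
        freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
        dominant_hz = float(freqs[np.argmax(spectrum[1:]) + 1])  # skip DC bin

        return {"rms": rms, "dominant_hz": dominant_hz}

    # Example: a 300 Hz tone buried in noise, sampled at 10 kHz
    fs = 10_000.0
    t = np.arange(0, 1.0, 1.0 / fs)
    sig = np.sin(2 * np.pi * 300 * t) + 0.3 * np.random.randn(t.size)
    print(acoustic_features(sig, fs))  # dominant_hz should be close to 300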
Simulation
The simulation modelling paradigm has evolved from the individual application of specialised simulation tools, such as CFD for solving complex fluid dynamics problems, through the simulation-based System Design approach for multi-level, multi-disciplinary systems, to the Digital Twin concept, in which simulation becomes a core functionality of the system itself, providing seamless assistance along the entire life cycle, e.g. supporting operation and service with a direct link to the operation data. It should be remarked that the earlier stages of this paradigm (e.g. CFD, FEM analysis and numerical optimization methods) still remain a fundamental pillar of modern engineering but, in an Industry 4.0 perspective, are no longer the end point.
IoT
Recent advances in the Internet of Things (IoT) paradigm pose a severe challenge in terms of efficient data handling. Cloud computing represents a possible answer, exploiting massive data centralization to leverage artificial intelligence techniques and big data analytics. While this approach is well suited to low-frequency data and statistical investigations, it is less suitable when real-time monitoring or high-frequency data streams must be processed.
Edge computing
Edge computing pushes applications, data and computing power away from centralized points to the logical extremes of a network. For industrial automation, this means processing the data where they are acquired, i.e. in the field. The significant decrease in the volume of data that must be moved, and hence the associated reduction in transmission costs and latency, suggests that edge computing approaches can be particularly useful for signal processing when pervasive sensor networks are employed, thus limiting capital expenditure and simplifying the revamping of existing systems with new-generation intelligent components.
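A minimal sketch of this idea in Python (the sensor stream, window size and summary fields are illustrative assumptions): raw samples are reduced on the field device to a compact summary record, and only that record is transmitted upstream.

    import json
    import numpy as np

    def summarize_window(samples: np.ndarray) -> dict:
        """Reduce a window of raw samples to a compact summary record."""
        return {
            "n": int(samples.size),
            "mean": float(samples.mean()),
            "std": float(samples.std()),
            "max": float(samples.max()),
        }

    # Hypothetical: a 10 kHz acquisition reduced to one record per second,
    # i.e. ~10,000 raw samples replaced by a few bytes of JSON.
    window = np.random.randn(10_000)      # stand-in for one second of data
    payload = json.dumps(summarize_window(window))
    print(payload)                        # only this summary leaves the edge device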
Augmented reality
With the Industry 4.0 initiative, Augmented Reality (AR) has come to be considered one of the most promising technologies for companies to invest in, especially to improve their maintenance services. In the past, several technological limitations prevented AR from becoming an effective industrial tool. Some of these have now been overcome by off-the-shelf technologies, while others have not yet been.
Data analytics
A fundamental part of the project is dedicated to data collection from relevant sensors and measurement devices: the large quantity of information coming from the plant and the process must be pre-processed by means of suitable filtering techniques, so that only consistent and reliable data sets are passed to the data analytics algorithms that detect component degradation. To reach this target, and according to the industrial use cases, it is necessary to determine the system requirements and to identify reliable Key Performance Indicators (KPIs) characterizing process quality and efficiency.
Although frequently considered the same, Knowledge Discovery and Data Mining (KDD) and Machine Learning are distinct branches of what is today called Data Science. Both areas have similar goals, in that they provide developers with ways to model different systems, but they differ in the information that must be fed to the algorithms. Moreover, several algorithms are used in both areas, although their application differs. KDD technologies excel at discovering unknown characteristics of the supplied datasets; KDD is thus frequently defined as "the non-trivial extraction of implicit, previously unknown, and potentially useful information from data." Machine Learning, in contrast, focuses on prediction based on known properties provided in the training data. In the Burner 4.0 project, both KDD and ML algorithms will be evaluated to provide the basis for defining Predictive Maintenance rules and for improving quality and optimizing process management based on the defined KPIs.
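As a minimal, hypothetical sketch of the pre-processing and KPI step described above (the sensor values, the robust filtering rule and the KPI definition are illustrative assumptions, not project specifications):

    import numpy as np
    import pandas as pd

    def clean(readings: pd.Series, z_max: float = 3.5) -> pd.Series:
        """Keep only consistent readings: drop missing values and reject
        outliers using a robust median/MAD criterion."""
        r = readings.dropna()
        dev = (r - r.median()).abs()
        mad = dev.median()                     # median absolute deviation
        return r[0.6745 * dev / mad <= z_max]  # modified z-score filter

    # Hypothetical KPI: efficiency indicator from flue-gas O2 readings (%)
    raw = pd.Series([3.1, 3.0, np.nan, 3.2, 15.0, 2.9, 3.1])  # 15.0 is a glitch
    o2 = clean(raw)
    kpi = 1.0 - o2.mean() / 21.0               # illustrative KPI definition
    print(f"KPI = {kpi:.3f} from {o2.size} of {raw.size} readings")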
Digital twin
In most definitions, the DT is considered a virtual representation that interacts with the physical object throughout its lifecycle and provides intelligence for evaluation, optimization, prediction, etc. These definitions focus on the physical side, the virtual side and the connection between them, which are the essential elements of the three-dimensional DT framework: (1) a physical entity in physical space, (2) a virtual entity in virtual space, and (3) a connection of data and information that ties the physical and virtual entities together.
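These three elements can be sketched, purely for illustration, as the following minimal Python structure; the class names and the placeholder model are hypothetical, standing in for the real burner and its calibrated simulation.

    from dataclasses import dataclass, field

    @dataclass
    class PhysicalBurner:
        """(1) Physical entity: exposes sensor readings from the field."""
        fuel_flow: float = 0.0    # kg/s, hypothetical measured value
        flame_temp: float = 0.0   # K

    @dataclass
    class VirtualBurner:
        """(2) Virtual entity: a model mirroring the physical state."""
        state: dict = field(default_factory=dict)

        def predict_temp(self, fuel_flow: float) -> float:
            # Placeholder; a real DT would call a calibrated simulation here
            return 300.0 + 4.0e5 * fuel_flow

    class Connection:
        """(3) Data/information link tying the two entities together."""
        def sync(self, phys: PhysicalBurner, virt: VirtualBurner) -> float:
            virt.state = {"fuel_flow": phys.fuel_flow,
                          "flame_temp": phys.flame_temp}
            # Residual between measurement and model, usable for diagnostics
            return phys.flame_temp - virt.predict_temp(phys.fuel_flow)

    phys = PhysicalBurner(fuel_flow=0.003, flame_temp=1520.0)
    residual = Connection().sync(phys, VirtualBurner())
    print(f"model residual: {residual:.1f} K")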
Since big data analytics techniques are available for dealing with large and diverse data sets, valuable information can be mined efficiently from the data. The data can therefore be considered the driver that provides the intelligence keeping the constructed DT in continuous operation. Data for the DT come from both the physical and virtual spaces: product lifecycle data from physical entities, simulated data from digital models, operation data from information systems, and related knowledge. Together, these comprehensively drive the operations of the DT. For example, digital model construction can be driven by rules and constraints mined from entity data, decisions in the related information systems can be driven by simulated data from the digital models, and the operations of the entities can be driven by pre-defined orders and plans from the models and systems. Without data, the DT cannot start working, let alone provide further analysis and optimization. As real-time data are generated continuously, increasingly valuable information accumulates for the DT.