## Abstract

The application of augmented reality (AR) to maintenance has significantly reduced the time operators spend finding and comprehending manual maintenance procedures. One area that requires innovation is reducing the rigidity of procedures within AR-guided maintenance applications. Current widely applicable strategies are limited in that they can only be completed off-site, or they can be completed on-site but rely on operator knowledge or expert intervention to perform reconfiguration. In this work, a novel framework is presented to allow for automated reconfiguring of procedures within AR-guided maintenance applications. Once triggered, the presented framework works autonomously. The framework relies on subassemblies of the machine being maintained and analyzes the effect a defective part has within its subassembly. This information is used to create a modified procedure using automatic procedure creation methods. An implementation of the framework is presented using a simple example, and the framework is then utilized and tested in a complete AR-guided maintenance application.

## 1 Introduction

Maintenance reduces the amount of downtime or machine malfunctions and is imperative in running operations and equipment effectively. Similar to annual vehicle inspections, maintenance procedures prolong the life of any mechanical system. Traditional maintenance typically involves the use of paper-based manuals that are invariant to the user’s experience level or environment. There are known inefficiencies with traditional maintenance, including the large amount of time technicians spend locating and comprehending procedures and the inability of traditional systems to adapt to each user’s level of experience. For instance, in the aircraft industry, one study by Ott found that 45% of an Aircraft Maintenance Technician’s shift is spent finding and reading procedures [1]. Even with substantial time spent finding and reading procedures, ambiguity can arise when the technician adapts the procedure for the current situation [2]. For maintenance on mechatronic systems, training a novice operator requires months or sometimes years, while even expert technicians must reference paper-based manuals for infrequent or complex procedures [3]. These issues increase not only labor cost but also production cost [4]. As systems become more complex over time, the cost of these challenges will only become more exorbitant [5].

The shortcomings of the traditional paper-based manual maintenance approach necessitate an alternative solution to continue performing maintenance effectively on modern mechanical systems. One such solution is the use of augmented reality (AR) for maintenance applications. Recently, AR-based maintenance applications have gained widespread traction [6,7]. AR is able to display the information required to perform maintenance in a more intuitive and interactive way than the traditional maintenance approach. AR-based maintenance systems increase the effectiveness of the maintenance operator and speed up the entire maintenance workflow [2]. AR also provides the opportunity for maintenance applications to be context-aware by adapting to the user’s characteristics and preferences and the maintenance environment [8]. Applying AR to maintenance has been shown to have various advantages including [3]

• the ability to employ less-skilled operators,

• up-to-date maintenance data,

• time and cost savings,

• a reduced operator error rate,

• a reduced amount of text needed,

• information retained in the AR system rather than in people, and

• a level of information adapted to each user’s skills.

While the application of AR to maintenance has resulted in great gains with respect to user efficiency and comprehension as compared to traditional maintenance, one of the aspects of AR-guided maintenance applications that is not conducive to the adaptable nature of AR applications is the rigidity of the maintenance procedures within these systems [9]. If the user encounters an issue with one of the machine’s parts while performing maintenance, many of the current AR-guided maintenance solutions either cannot handle the change in working conditions, or they rely on user knowledge or expert intervention to move forward. The presented solution instead relies upon automated reconfiguration strategies to aid the user in performing maintenance. An example of an automated reconfiguration process is shown in Fig. 1. The target part of the procedure is the motor (highlighted in green) within a desktop fan. The user is successfully able to perform steps 1 through 5 but encounters an issue with the fan blades in step 6. Thus, the system attempts to provide a new maintenance procedure based on the defective part. For the example given in Fig. 1, a new maintenance procedure cannot be found, so the user is instructed to replace the entire subassembly of the defective part.

Fig. 1

While the ability to automatically reconfigure procedures was never a possibility with traditional paper-based maintenance manuals, it is a viable and attainable option with AR technology. Automated reconfiguration of AR-guided maintenance instructions is the primary focus of the outlined work.

A modification to the general framework used to create an AR-guided maintenance application is needed to allow for automated reconfiguring of procedures. The outlined framework in this paper is applicable to any AR-guided maintenance system provided that certain prerequisite steps are completed as described in Sec. 3. The two main contributions of this paper are as follows:

1. Present a novel framework to allow for automated reconfiguring of maintenance procedures.

2. Provide proof of concept for the framework by implementing it in a handheld AR-guided maintenance application.

The structure of this paper is as follows. In Sec. 2, a comprehensive review of solutions for procedure reconfiguration in AR-guided maintenance and (dis)assembly applications is provided. In Sec. 3, the novel framework for procedure creation that allows for reconfiguring is presented, and the methods to implement the framework are explained using a simple example. In Sec. 4, the implementation of the framework in a handheld AR application is presented. In Sec. 5, a discussion of the implementation, application, and implications of the framework is presented. Additionally, conclusive remarks and directions for future research are provided.

## 2 Literature Review

AR is a technology that supplements the real world with virtual content in real time [10]. The first conceptualization of AR and its counterpart, virtual reality (VR), was theorized by Sutherland in his essay “The Ultimate Display” in 1965 [11]. More recently, in 2001, Azuma et al. defined the properties of AR systems as combining real and virtual elements in the real world, running in real time and interactively with the user, and registering real and virtual elements with respect to each other [10]. VR, while similar to AR, immerses the user in a completely virtual environment in which they interact with virtual objects.

There are many different types of AR including head-mounted, handheld, and projector-based. The type of AR chosen depends on various constraints including budget, scene characteristics, the system’s required portability, and the tracking system’s required measurement precision and reliability [72].

For this work, only the procedure creation component of AR-guided maintenance and (dis)assembly applications is focused upon. Many AR-guided maintenance and (dis)assembly projects have been implemented since the 1990s. For a review of the AR-guided maintenance literature from that decade, the reader is directed to the related work section of Ref. [73]. More recent reviews of AR-guided maintenance projects are provided in Refs. [6,7].

The procedure reconfiguration methods within recent AR-guided maintenance and (dis)assembly systems can be classified into three main categories: off-site reconfiguring, on-site reconfiguring, and automated reconfiguring. Off-site reconfiguring refers to modification that must be completed away from the location where maintenance is being performed either by the maintenance technician or by an expert. On-site reconfiguring refers to modification that can be completed at the location where maintenance is being performed either by the user or by a remote expert. Automated reconfiguring refers to modification that is performed by the application itself and does not require user or expert intervention. The trigger to start reconfiguration is not taken into account when considering these categories.

### 2.1 Off-Site Reconfiguring.

There are various AR-guided maintenance and (dis)assembly solutions found in the literature that include off-site reconfiguring abilities, meaning that reconfiguring can be completed away from the maintenance site by experts or operators.

All of the off-site reconfiguring strategies include unique methods to create AR content for procedures. The AR-guided maintenance prototype system presented in Ref. [74] uses a language for the creation of AR content called AR Media Language. Procedures are created based on computer-aided design (CAD) models, text, or images by associating them with specific workpieces. The solution presented in Ref. [75] includes easy AR authoring through PowerSpace, an AR authoring system. PowerSpace utilizes Microsoft PowerPoint [76] to allow the user to create slides that become procedures or augmentations on the screen. The Authoring for Context-Aware AR (ACAAR) system presented in Ref. [77] allows the user to model AR content through arranging various media files such as texts, images, and 3D models spatially and specifying the logical relationships between the AR content and contexts. The AR-guided maintenance system presented by Ref. [78] initially derives procedures from paper manuals but allows experts to create AR content using CAD or reverse engineering models and shape template matching. The Authoring Wizard presented in Ref. [79] allows assembly experts to create and modify assembly procedures for the Mixed Reality Assembly Instructor program that technicians use to perform assembly tasks. Various researchers have developed tools to create AR instructions based on video sequences of expert technicians performing procedures [80–83]. These tools can be utilized for off-site reconfiguration by experts. AR instructions can also be automatically created from CAD model geometry [84–87] or from a combination of information such as design, factory, and sales information as in Refs. [86,88], technical documentation and CAD models as in Ref. [89], or product, resource, and process information as in Refs. [90,91].

While all of these solutions are improvements over AR-guided maintenance applications that do not allow for procedure modification, none of these solutions are ideal because the authoring methods presented are not directly available to the operator while they are performing maintenance on-site. In this scenario, if the user encounters an issue with a part in the machine while on-site, they must leave the site, reconfigure the procedure, then return to the site to continue maintaining the machine.

### 2.2 On-Site Reconfiguring.

The systems proposed in the literature that include on-site reconfiguring methods are those that either allow for expert-driven procedure modification from a remote location over the Internet or allow for user-driven procedure modification at the location where maintenance is being performed.

The following systems only include remote expert-driven reconfiguration. The system within the ULTRA project presented in Ref. [92] includes a tele-consultation module that the operator can use to communicate with an expert. The expert is able to alter procedures using a template-based authoring tool. A remote reconfiguring solution is given in Ref. [9] where procedures are initially defined using a state machine, and tasks are defined for each part in the CAD model with elementary operations such as slide on/off, screw on/off, etc. Experts are able to remotely reconfigure or modify the procedures that are implemented by the technician. The AR-guided maintenance test system implemented in Ref. [93] includes a remote collaboration tool that allows the expert to modify the user’s 3D virtual annotations in real time. Reference [94] provides an AR-guided maintenance solution for industrial environments where experts within the support center of the industrial facility are able to add virtual information on the user’s display if needed. The predictive maintenance system in Ref. [95] allows experts to anchor annotations within the user’s field of view during a remote assistance call. Similarly, the AR training platform developed in Ref. [96] sends a video of the performed maintenance procedure to the expert trainer who is able to modify the procedure in real time or add “Virtual Post-Its” to the user’s screen for additional guidance. The Product Service System (PSS) AR solution in Ref. [84] includes a cloud-based remote maintenance solution that allows experts to exchange maintenance instructions with technicians. Likewise, the head-mounted cloud-based AR-guided maintenance solution presented in Ref. [97] allows operators to reach out to expert engineers when they encounter an issue while performing maintenance. The expert engineer then creates AR content remotely and projects the content onto the technician’s field of view.

There are systems in the literature that include both on-site user-driven reconfiguration and remote expert-driven reconfiguration. The Authorable Context-Aware AR System (ACARS) in Ref. [5] includes a bi-directional authoring tool where content can be created both off-site by experts and on-site by technicians. This expert-driven authoring tool provides a convenient method for users to implement significant changes to procedures, while the user-driven authoring tool is designed to allow operators to make smaller changes to procedures that are created by experts. The AR-assisted maintenance system (ARAMS) presented in Ref. [98] includes a bi-directional content creation tool and a remote collaboration mechanism that allows experts to modify procedures while the user is on-site. Finally, an AR-guided maintenance solution for the modern shop floor is given in Ref. [99]. The user has access to a database of predefined tasks that they can use if they encounter an issue while performing maintenance. If extensive modification is needed, the user has the ability to reach out to an expert engineer who can provide the user with new AR instructions.

While on-site reconfiguring methods provide operators with convenient access to solutions when they encounter an issue while performing maintenance, these methods fail if experts are unavailable or if the Internet connection at the maintenance site is unreliable. In the case of the bi-directional authoring tools in Refs. [5,98,99], the operator is able to author content and modify procedures autonomously. This is ultimately an unsuitable solution because the quality of the produced procedures depends on the user’s level of knowledge and experience, which in turn impacts the success of reconfiguration in these systems. For these reasons, automated reconfiguration strategies are more reliable in the field.

### 2.3 Automated Reconfiguring.

There are few solutions present in the literature that include automated reconfiguring where the AR system itself can modify maintenance procedures on demand.

An interactive AR system was presented in Ref. [100] that incorporates context-awareness to detect a machine’s operating status after evaluation of the current condition of the machine. Procedures are automatically developed using condition information and graph-based algorithms. Reconfiguring is made possible by reevaluating the machine’s current state after every step in the procedure. Similarly, the AR assembly guidance system in Ref. [101] allows the user to choose the next task in the current procedure if multiple options are available based on the assembly structure of the machine. The visual assembly tree structure (VATS) used provides a kind of adaptive procedure creation strategy for automated reconfiguration. The AR-assisted product disassembly sequence planning (DSP) (ARDIS) system described in Ref. [102] utilizes a procedure regeneration strategy that automatically provides the user with a reconfigured procedure in the case of unforeseen obstacles. Procedure regeneration is only available via a web server.

Automated reconfiguring is the ideal option when compared to off- and on-site reconfiguring because it does not rely on any outside source or knowledge in order to produce a successful result. In cases where the operator is not experienced, does not have access to an expert, and encounters an issue with a part that prevents them from moving forward with the procedure, automated reconfiguration is the only option. While the reconfiguring solutions presented in Refs. [100–102] are able to perform automated reconfiguring, the solution in Ref. [100] is restricted to systems that have flow, such as piping systems, the assembly sequences in Ref. [101] are restricted to the hierarchical relationships defined in the VATS, and the solution in Ref. [102] requires an Internet connection. Thus, none of these solutions is universally applicable in AR-guided maintenance applications. The framework proposed in Sec. 3 for procedure creation is designed to allow for automated reconfiguring of procedures in any AR-guided maintenance application.

Automated reconfiguring is directly related to the work done in DSP. Thus, a brief review of DSP methods is given next.

### 2.4 Disassembly Sequence Planning.

DSP is a field of research that is traditionally concerned with finding the optimal disassembly sequence for End-Of-Life (EOL) recovery of materials or components within products at the end of their lifecycles [103]. There are different types of DSP methods including complete or selective and destructive or non-destructive. Complete disassembly means that the machine or product is disassembled to its base components or elements. Selective disassembly means that the goal of the disassembly plan is to obtain certain target parts. Destructive DSP methods change the physical structure of the machine, making reassembly impossible. Non-destructive methods preserve the original physical structure of the components of the machine or product, making reassembly possible. When considering maintenance solutions, the only applicable DSP methods are selective, non-destructive methods [104]. With selective disassembly, the goal can be to either obtain a single target part or multiple target parts. Further, parts can be disassembled in either a serial or parallel manner [105].

In order to create a disassembly sequence, relevant information about the machine or product that is being disassembled must be utilized in an organized manner. There have been many representations for product information proposed in the literature including graph-based and matrix-based methods. Some of the most prevalent graphical methods include adjacency [106], connection/liaison [107,108], interference [108], precedence graphs [109–111], AND/OR graphs [107,108,112], and Petri Nets (PNs) [113–115]. Some of the most prevalent matrix representations include matrix-based versions of the adjacency [106,116], connection/liaison [108,117,118], interference [108,119–123], and precedence matrices [111,119,124], contact matrices and translation matrices [102,122,123,125], transition matrices [122,126–129], and disassembly constraints matrices [130,131]. Search trees are an additional DSP representation that is popular in the literature [132–134]. In cases of increasing product complexity, it is common to divide the product or machine into subassemblies either manually, as in Ref. [135], or automatically, as in Refs. [106–108,117,122].
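As a concrete illustration of one of the representations above, a precedence matrix P (where P[i][j] = 1 means part i must be removed before part j) yields a valid disassembly sequence through a topological sort. The following sketch is illustrative only; the three-part data is invented and the function name is not from any cited work.

```python
# Derive a valid disassembly sequence from a precedence matrix via
# topological sorting: repeatedly remove a part with no unremoved
# predecessors. P[i][j] = 1 means part i must be removed before part j.

def disassembly_order(P):
    n = len(P)
    # indegree[j] = number of parts that must come out before part j
    indegree = [sum(P[i][j] for i in range(n)) for j in range(n)]
    order, ready = [], [j for j in range(n) if indegree[j] == 0]
    while ready:
        i = ready.pop()
        order.append(i)
        for j in range(n):
            if P[i][j]:
                indegree[j] -= 1
                if indegree[j] == 0:
                    ready.append(j)
    return order

# Invented example: part 0 precedes part 1, which precedes part 2.
P = [[0, 1, 0], [0, 0, 1], [0, 0, 0]]
print(disassembly_order(P))  # [0, 1, 2]
```

Cost-minimizing DSP methods can be viewed as searching among all such valid orders for the cheapest one.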

When considering DSP methods for EOL products and machines, product condition can vary widely. This can be a challenging problem especially with non-destructive DSP methods, as is required in maintenance applications. In order to attempt to account for product condition when creating a disassembly sequence, researchers have investigated mathematical models including fuzzy logic [136,137] and probability [138].

An operation must be performed on the representation of product information in order to create a disassembly sequence. In many cases, this operation is an optimization. Some of the most prevalent optimization algorithms used for DSP are genetic algorithms [110,125,130,139] and swarm optimization algorithms such as ant colony optimization [116,140] and the enhanced discrete bees algorithm [121]. For selective disassembly, popular operations include recursive- or rule-based methods [141–145] and the wave propagation approach, developed in Refs. [135,146–149] and expanded in Refs. [109,150] for general DSP applications [151]. Various heuristic methods, including genetic algorithms [152–156], ant colony optimization [140,157,158], the artificial bee colony algorithm [159], particle swarm optimization [160–162], tabu search [163], and scatter search [164–167], are also prevalent in the literature. Similarly, motion-planning or Rapidly-exploring Random Tree (RRT)-based methods [168] and integer programming methods [112,169] are commonly used for selective disassembly operations. For a full review of selective disassembly sequence methods, the reader is referred to Refs. [141,170–172]. The main objective of these DSP methods is to minimize the disassembly cost, where the cost can encompass disassembly time, time for tool changes, monetary cost, safety of the disassembly plan, etc. Other algorithms that are used to produce disassembly sequences include the branch and bound method [112,132,134] and graph-based search algorithms such as the Dijkstra algorithm and the A* algorithm [109,110,114,173].

The framework presented in this work requires the use of an automatic procedure creation method in order to allow for reconfiguration. Automatic procedure creation can be implemented via DSP methods, as in Refs. [100,102,174–176], or by other methods, such as from CAD model geometry in Refs. [84–87], from a combination of information such as design, factory, and sales information as in Refs. [86,88], from technical documentation and CAD models as in Ref. [89], or from product, resource, and process information as in Refs. [90,91].

### 2.5 Complementary Pertinent Work.

The outlined framework in this paper builds upon complementary concepts developed by various researchers, including Belhadj et al. [117], Luo et al. [116], and Gungor and Gupta [177]. An overview of related pertinent concepts is outlined next.

Belhadj et al. presented a method in Ref. [117] to use part adjacency, fit, volume, and surface area information to automatically detect the subassemblies within a given assembly. Subassemblies are built upon base parts, which are parts within the assembly that have the highest number of connections, volume, and surface area based on a single-objective constrained optimization problem.

Luo et al. presented a multi-layer matrix representation that is used to store adjacency information for subassemblies of products [116]. This representation takes up less computational space when compared to one large adjacency matrix for the entire product. Additionally, the authors present two procedure creation methods, one being the Traversing searching algorithm (hereafter called the TSA) and the other being the Ant Colony Optimization method.

Lastly, Gungor and Gupta presented an algorithm for DSP with uncertainty [177]. Gungor and Gupta’s approach uses a product’s precedence matrix and information about defective parts within the product to analyze which parts are affected by the defective part and to disassemble the parts that are not directly affected by the defective part through DSP.

## 3 Automated Reconfiguration: Computational Approach

The overall framework, shown in Fig. 2, is an alternative to the typical procedure creation module within an AR-guided maintenance application. The various components in the framework are either performed offline or online. Offline components of the framework are created and loaded into the application before it is used, while online components of the framework are executed while the operator uses the application.

Fig. 2

The input to the overall framework is the CAD model of the machine being maintained. Adjacency analysis is performed on the CAD model to find the adjacency information for each part of the machine. Then, subassembly detection uses the adjacency information to find subassemblies within the machine. The adjacency and subassembly information is then used to create a procedure.

The reconfiguration process then begins. If a feasible maintenance procedure is created, the user starts to perform the maintenance procedure. If a feasible maintenance procedure is not created, the subassembly of the current part in the procedure or the defective part, if applicable, is replaced. The user then proceeds to perform the modified maintenance procedure. If reconfiguration of the procedure is required while the user is performing maintenance, the user is able to trigger the Defective Part Analysis (DPA). This analysis determines which parts are affected by the user-inputted defective part. The affected parts information is then used in procedure creation to create a new maintenance procedure for the user.
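The decision logic above can be sketched as follows. This is a hypothetical, heavily simplified stand-in for illustration only: the function names, the dictionary-based adjacency data, and the rule that any part with a non-111 adjacency code relative to the defective part counts as "affected" are all assumptions of this sketch, not the paper's actual Defective Part Analysis.

```python
def defective_part_analysis(adjacency, defective):
    """Toy DPA: treat any part whose adjacency code with the defective
    part is not 111 (i.e., blocked on some axis) as affected by it."""
    return {p for p, code in adjacency[defective].items() if code != 111}

def next_step(adjacency, subassembly_of, target, defective):
    """Decide how to continue after the user flags a defective part."""
    affected = defective_part_analysis(adjacency, defective)
    if target in affected:
        # No feasible reconfigured procedure: fall back to replacing the
        # whole subassembly that contains the defective part.
        return ("replace subassembly", subassembly_of[defective])
    # Otherwise continue with a procedure that avoids the affected parts.
    plan = [p for p in adjacency if p not in affected and p != defective]
    return ("perform procedure", plan)

# Invented adjacency codes loosely styled after the desktop fan example.
adjacency = {
    "fan blades": {"motor": 244, "front cage": 113, "back cage": 112},
    "motor": {"fan blades": 344, "back cage": 113},
    "front cage": {"fan blades": 112},
    "back cage": {"fan blades": 113, "motor": 112},
}
subassembly_of = {"fan blades": "motor subassembly"}
print(next_step(adjacency, subassembly_of, "motor", "fan blades"))
```

With the defective fan blades blocking the target motor, the sketch falls back to replacing the blades' subassembly, mirroring the fallback branch described above.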

A running example of a desktop fan (Fig. 3) is used to elucidate the concepts presented in the following subsections. The following assumptions apply to the general framework:

• The adjacency information for the parts in the machine being maintained does not change over time.

• Only one machine is being maintained at any one time.

• There is only one target part for maintenance when a procedure is initially created.

• There is only one defective part when performing reconfiguration.

Fig. 3

Procedure creation has two prerequisite tasks: adjacency analysis and subassembly detection. Both of these tasks are performed offline before the application is used by the maintenance technician.

### 3.1 Adjacency Analysis.

Adjacency analysis refers to the process of determining the relationships between all of the parts in the machine being maintained. The implemented adjacency analysis method in the outlined framework is a combination of the ideas presented in Refs. [117] and [116]. The two major differences between the two approaches are (1) the choice between a global and a local coordinate system and (2) the decision of what adjacency direction information to record for each part in the machine. Reference [117] uses a global coordinate system, whereas Ref. [116] uses a local coordinate system. A local coordinate system bases each part’s direction of motion on the previous part’s direction of motion. For an AR system where the user’s or device’s perspective changes with time, non-specific directions of motion can become ambiguous to the user as parts are moved around the scene. For this reason, a global coordinate system was chosen for this implementation so that definitive Cartesian directions could be provided to the user during the maintenance procedure.

Regarding adjacency directions, Ref. [117] considers adjacency in the Cartesian directions in general while Ref. [116] considers adjacency in both the positive and negative Cartesian directions. In order to use a modified version of the TSA presented by Ref. [116] for procedure creation, adjacency information in both the positive and negative Cartesian directions was extracted for each part in the machine.

The implemented adjacency analysis method takes as input the CAD model of the machine being maintained. The final outputs from the adjacency analysis process are the adjacency and contact matrices.

The following assumptions are imposed for the inputted CAD model:

• All parts in the machine are available in the CAD model,

• All parts in the CAD model adhere to the same length convention,

• All parts in the CAD model adhere to the same global coordinate system, and

• All parts in the CAD model are rigid in their composition and position and orientation (pose).

First, the parts list and bounding box dimensions are extracted from the CAD model. The parts list and bounding box dimensions for the desktop fan example are given in Table 1.

Table 1

Part name and size information for the desktop fan example

| Part number | Part name | Bounding box, x (mm) | Bounding box, y (mm) | Bounding box, z (mm) |
| --- | --- | --- | --- | --- |
| 1 | Back cage | 133.35 | 126.06 | 72.46 |
| 2 | Front cage | 133.35 | 126.39 | 62.76 |
| 3 | Back plastic cover | 88.90 | 87.80 | 66.51 |
| 5 | Stand | 145.31 | 76.98 | 75.60 |
| 6 | Stand rivet (L) | 11.31 | 6.00 | 6.00 |
| 7 | Stand rivet (R) | 11.31 | 6.00 | 6.00 |
| 8 | Motor | 50.00 | 64.28 | 41.23 |
Next, the adjacency matrix is created. The adjacency matrix is an n × n matrix where n is the total number of parts in the machine. Each diagonal element of the matrix is set to 111, while each off-diagonal element of the matrix holds three digits corresponding to the x, y, and z Cartesian directions, respectively. The digits are defined by Eq. (1) [117]. The adjacency matrix for the desktop fan example is given in Eq. (2).
$$\mathrm{Adj}(i,j)=\begin{cases}1 & \text{if part } j \text{ does not prevent part } i \text{ from moving in either the positive or negative direction}\\2 & \text{if part } j \text{ prevents part } i \text{ from moving in the positive direction}\\3 & \text{if part } j \text{ prevents part } i \text{ from moving in the negative direction}\\4 & \text{if part } j \text{ prevents part } i \text{ from moving in both the positive and negative directions}\end{cases}$$
(1)
$$\mathrm{Adj}=\begin{bmatrix}111&112&113&112&431&244&344&442\\113&111&111&443&431&111&111&113\\112&111&111&111&131&111&111&442\\113&442&111&111&411&111&111&443\\421&421&121&411&111&244&344&111\\344&111&111&111&344&111&111&111\\244&111&111&111&244&111&111&111\\443&112&443&442&111&111&111&111\end{bmatrix}$$
(2)
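Each off-diagonal element above packs one digit per Cartesian axis, so it can be unpacked back into per-axis blocking information. The following minimal Python sketch illustrates the encoding of Eq. (1); the function and dictionary names are illustrative, not from the paper.

```python
# Decode an Adj(i, j) element per Eq. (1). Each element packs one digit
# per Cartesian axis (x, y, z): 1 = no blocking, 2 = blocked in the
# positive direction, 3 = blocked in the negative direction, 4 = both.
BLOCKED = {1: (False, False), 2: (True, False), 3: (False, True), 4: (True, True)}

def decode(element):
    """Map a three-digit adjacency code to {axis: (blocked+, blocked-)}."""
    digits = [int(d) for d in f"{element:03d}"]  # zero-pad, e.g. 111 -> "111"
    return {axis: BLOCKED[d] for axis, d in zip("xyz", digits)}

# Example: a code of 431 means motion is blocked in both x directions,
# blocked in the negative y direction, and free along z.
print(decode(431))
```

Note the symmetry this encoding implies: if part j blocks part i in a positive direction (digit 2), then part i blocks part j in the corresponding negative direction (digit 3), so transposed elements of the adjacency matrix swap digits 2 and 3 while digits 1 and 4 are unchanged.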
The next step in adjacency analysis is the creation of the contact matrix. The contact matrix is an n × n matrix that contains information about how constricted the fit is between parts. This information will be utilized later on when determining the correct subassemblies for parts. Each diagonal element of the matrix is set to 1, while each off-diagonal element of the matrix can be defined based on Eq. (3) [117]. The contact matrix for the desktop fan example is given in Eq. (4).
$$\mathrm{Contact}(i,j)=\begin{cases}1 & \text{if the fit between parts } i \text{ and } j \text{ is loose (clearance fit)}\\2 & \text{if the fit between parts } i \text{ and } j \text{ is tight (forced fit)}\end{cases}$$
(3)
$$\mathrm{Contact}=\begin{bmatrix}1&1&1&1&1&2&2&1\\1&1&1&1&1&1&1&1\\1&1&1&1&1&1&1&1\\1&1&1&1&1&1&1&2\\1&1&1&1&1&2&2&1\\2&1&1&1&2&1&1&1\\2&1&1&1&2&1&1&1\\1&1&1&2&1&1&1&1\end{bmatrix}$$
(4)

### 3.2 Subassembly Detection.

Subassembly detection refers to the process of automatically finding combinations of parts in the machine being maintained in order to introduce modularity into the procedure creation process and to permit reconfiguration. The implemented subassembly detection method is derived from Ref. [117], while the storage of the resulting subassembly matrices is derived from Ref. [116]. While Ref. [116] derives subassemblies of the machine from the CAD model’s Bill of Materials (BOM), the method presented by Ref. [117] is preferable because it is based solely on the geometry of the parts in the machine and not on the defined relationships within the CAD model, which can vary greatly depending upon the designing style of the CAD creator. The only major modification made to the subassembly detection method presented by Ref. [117] is that the relationships between subassemblies are not broken. Since procedures are being created with this adjacency information, the relationships between subassemblies must be preserved. The adjacency and contact matrices from the adjacency analysis are inputs to the subassembly detection process. The output from subassembly detection is a series of matrices that hold the adjacency information for the detected subassemblies.

The first step in subassembly detection is to perform element-wise concatenation between the adjacency and contact matrices. Each element of the combined matrix then has digits corresponding to the contact fit and x, y, and z adjacencies, respectively. Equation (5) shows the combined matrix for the desktop fan example.
$Combined=[12345678111111112111311121431224423441442211131111111114431431111111111113311121111111111111131111111111442411131442111111111411111111112443514211421112114111111224423441111623441111111111112344111111111111722441111111111112244111111111111814431112144324421111111111111111]$
(5)

The next step in the process of finding subassemblies is finding the first-level base parts for the machine. As described above, base parts form the foundation of each subassembly. The first-level base parts are the initial base parts found for the machine. As a prerequisite to finding the first-level base parts, the number of connections, surface area, and volume of each part are calculated. The number of connections is calculated from the elements of each part's row of the adjacency matrix: digits of 1 count as no connection, digits of 2 or 3 count as one connection, and digits of 4 count as two connections. The surface area and volume of each part are calculated from the part's bounding box.

A single-objective constrained optimization problem is solved to determine the base parts for each subassembly. The optimization problem formulation is given in Eq. (6) where α, β, and γ are the decision variables to be optimized, Si is the surface area for each part, St is the total surface area for all parts, Nr is the total number of connections for each part, Vi is the volume for each part, and Vt is the total volume for all parts [117].
$$\max_{\alpha,\beta,\gamma} F_{n_i}=\alpha\frac{S_i}{S_t}+\beta N_r+\gamma\frac{V_i}{V_t}\quad \text{s.t.}\quad 0\le\alpha,\beta,\gamma\le 1,\quad \alpha+\beta+\gamma=1$$
(6)
The decision variables α, β, and γ are weights in the objective function that affect surface area, number of connections, and volume, respectively. These weights are optimized for each part in order to maximize the overall function value. In this way, parts that have a large surface area, a high number of connections, and a large volume are preferred. Since base parts form the foundation of subassemblies, it then follows that parts with the highest objective values are chosen as base parts. Specifically, parts with an Fni value greater than the average Fni value are considered base parts. The decision variable values and objective function results for the desktop fan example are given in Table 2.
Table 2

Information required for and results of initial base parts identification for desktop fan example

| Part number | Part name | Si (mm²) | Nr | Vi (mm³) | Optimal α | Optimal β | Optimal γ | Fni |
|---|---|---|---|---|---|---|---|---|
| 1 | Back cage | 8.55E+04 | 21 | 1.22E+06 | 1.00E+00 | 2.24E-07 | 4.30E-04 | 2.67E-01 |
| 2 | Front cage | 8.33E+04 | 10 | 1.06E+06 | 1.00E+00 | 2.96E-09 | 3.28E-04 | 2.60E-01 |
| 3 | Back plastic cover | 4.29E+04 | 7 | 5.19E+05 | 2.97E-03 | 3.24E-07 | 9.97E-01 | 1.28E-01 |
| 5 | Stand | 5.64E+04 | 19 | 8.46E+05 | 1.00E+00 | 2.34E-07 | 4.26E-04 | 1.76E-01 |
| 6 | Stand rivet (L) | 3.43E+02 | 10 | 4.07E+02 | 8.95E-05 | 1.35E-07 | 1.00E+00 | 1.05E-04 |
| 7 | Stand rivet1 (R) | 3.44E+02 | 10 | 4.07E+02 | 2.59E-03 | 5.88E-08 | 9.97E-01 | 1.05E-04 |
| 8 | Motor | 1.82E+04 | 16 | 1.33E+05 | 3.96E-04 | 2.50E-07 | 1.00E+00 | 3.26E-02 |

St = 3.21E+05 mm², Vt = 4.07E+06 mm³, average Fni = 1.17E-01.
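The base-part selection step can be sketched as follows, treating the per-part weights (α, β, γ) as already returned by an optimizer for Eq. (6); the sample values in the test come from Table 2, and the function name and data layout are illustrative:

```python
def base_parts(parts, S_t, V_t, weights):
    """Score each part with the objective of Eq. (6) and keep the parts
    scoring above the average as base parts.

    parts:   list of (name, S_i, N_r, V_i) tuples
    weights: per-part (alpha, beta, gamma) tuples, assumed to come from
             solving the optimization in Eq. (6) for each part
    """
    scores = [a * S / S_t + b * N + g * V / V_t
              for (_, S, N, V), (a, b, g) in zip(parts, weights)]
    avg = sum(scores) / len(scores)
    names = [p[0] for p, f in zip(parts, scores) if f > avg]
    return names, scores
```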

After the identification of the initial base parts, an analysis is completed to identify parts in the machine that are connected to more than one base part. For each such part, the weight of the connection between the current part and each candidate base part is calculated by adding together the digits of the combined-matrix element corresponding to that connection. A higher weight indicates a tighter fit and a greater number of connections between the current part and the corresponding base part, so the connection with the highest weight is kept. If the weights are the same between the current part and all candidate base parts, the connection to the base part with the highest objective function value is kept.
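The weighting and tie-breaking step can be sketched as follows (the function names and dict-based layout are illustrative):

```python
def connection_weight(code):
    """Sum the digits of a combined-matrix element (e.g., 2344 -> 13).
    A higher sum means a tighter fit and more connections between the
    part and the candidate base part."""
    return sum(int(d) for d in str(code))

def assign_base_part(part_row, candidates, objective):
    """Keep the strongest connection; ties fall back to the base part
    with the highest objective function value F_ni.

    part_row:   dict base part -> combined-matrix element for this part
    candidates: base parts the part is connected to
    objective:  dict base part -> F_ni value
    """
    return max(candidates,
               key=lambda b: (connection_weight(part_row[b]), objective[b]))
```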

The adjacency information from the combined matrix for the first-level base parts is saved in a separate matrix according to Luo et al.’s multi-layer adjacency matrix representation [116]. The adjacency information for the base parts of each subassembly is recalculated based on the newly found subassemblies, which results in changes to the contact fit information and number of connections for each part. The process of finding base parts and subassemblies is then repeated for each base part to create a hierarchical subassembly structure.

The final step for base part determination and subassembly detection involves including the adjacency information for any parts that are not included in the base parts structure into the first-level subassembly. The resulting structure of base parts and subassemblies for the desktop fan example is given in Fig. 4.

Fig. 4

The methods put forth by Refs. [116,117] find only one level of base parts and subassemblies. A hierarchy of base parts and subassemblies is found in this work in order to give increased flexibility to the reconfiguration process. The reconfiguration process works by finding a reconfigured procedure for the parts within the subassembly of the defective part. Having additional subassemblies and more layers allows smaller units to be replaced if a reconfigured procedure cannot be found.

### 3.3 Procedure Creation.

Procedure creation uses adjacency and subassembly information in order to create an ordered list of parts that make up the final maintenance procedure. The initial procedure is created based on a user-defined target part, which is the part of the machine that needs to be maintained. Procedures are created for each subassembly in segments depending upon how many subassemblies are between the first-level subassembly and the target part. The first-level subassembly represents the machine as a whole and contains both the initial subassemblies and parts that do not belong to any subassembly. The final procedure is a combination of each procedure segment.

The procedure creation method presented is based on DSP theory. The DSP method chosen for this implementation is a modification of the TSA given by Ref. [116]. The implemented method, hereafter called the modified traversing searching algorithm (MTSA), uses a suboptimal strategy to find a sufficient disassembly sequence. This sequence is then converted into the final overall procedure. The computational flow of the MTSA is illustrated in Fig. 5.

Fig. 5

The inputs to the MTSA include the base parts and subassembly adjacency information, along with the user-defined target part. The MTSA is run for each hierarchical subassembly down to and including the subassembly that contains the target part. When creating a new procedure, it is assumed that the procedure begins with the whole machine assembly. While it is possible to begin a procedure from a specified part, this stipulation is enforced in order to simplify the implementation.

In order to demonstrate the usage of the framework, a maintenance procedure is developed to replace the motor (part 8, highlighted in Fig. 3) in a desktop fan. Since the target part is a member of the second-level subassembly, the MTSA is run for both the first-level and second-level subassemblies in order to create the overall procedure.

A preliminary step to finding the initial procedure is adding the objective function values for the procedure into the diagonal elements of each subassembly adjacency matrix. The objective chosen for this implementation is the minimization of the total procedure time. The following assumptions are upheld for the objective function values in order to simplify this procedure creation implementation:

• It is assumed that the disassembly time for each part is directly proportional to that part’s total number of connections.

• It is assumed that the disassembly time for each part is the same as the assembly time for each part.

• It is assumed that the disassembly time for each part is the same in all three Cartesian directions.

Based on these assumptions, the number of connections for each part is inputted into the corresponding diagonal element in each subassembly adjacency matrix. The modified first-level subassembly adjacency matrix for the desktop fan example is shown in Eq. (7).
$$FirstLevel=\begin{bmatrix} & 1 & 2 & 3 & 5 & 6 & 7\\1 & 23 & 1442 & 1443 & 1431 & 2244 & 2344\\2 & 1443 & 8 & 1111 & 1431 & 1111 & 1111\\3 & 1442 & 1111 & 6 & 1131 & 1111 & 1111\\5 & 1421 & 1421 & 1121 & 7 & 2244 & 2344\\6 & 2344 & 1111 & 1111 & 2344 & 10 & 1111\\7 & 2244 & 1111 & 1111 & 2244 & 1111 & 10\end{bmatrix}$$
(7)
The MTSA starts by splitting the inputted subassembly matrix into separate matrices for each Cartesian direction. For procedure creation, the first digit in each off-diagonal element of the subassembly matrix corresponding to the contact fit of the connection is omitted. Next, the rows in each of the three Cartesian matrices are sorted based on the predetermined objective and objective function values for each part. The sorted Cartesian matrices for the first-level subassembly adjacency matrix for the desktop fan example are shown in Eq. (8).
(8)
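The splitting step described above can be sketched as follows; sorting of the rows by the objective values is left out, and the helper name is illustrative:

```python
def cartesian_matrices(sub):
    """Split a subassembly matrix into separate x, y, and z matrices.

    Off-diagonal elements are 4-digit codes [fit, x, y, z]; the leading
    contact-fit digit is dropped for procedure creation. Diagonal
    elements hold the per-part objective values and are copied through."""
    n = len(sub)
    mats = {"x": [], "y": [], "z": []}
    for i in range(n):
        rows = {d: [] for d in "xyz"}
        for j in range(n):
            if i == j:
                for d in "xyz":
                    rows[d].append(sub[i][j])  # objective value on diagonal
            else:
                s = f"{sub[i][j]:04d}"         # digits: fit, x, y, z
                rows["x"].append(int(s[1]))
                rows["y"].append(int(s[2]))
                rows["z"].append(int(s[3]))
        for d in "xyz":
            mats[d].append(rows[d])
    return mats
```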

The idea behind the MTSA is to repeatedly find a part with a feasible disassembly direction, that is, a row with no 4s and either only 2s or only 3s, and remove it until the target part can be retrieved. If the row in the current Cartesian direction meets these qualifications, the part corresponding to the current row is added to the disassembly sequence in the direction opposite to its connections. The row and column corresponding to this part are then removed from all three Cartesian matrices. The MTSA then reiterates the process from the first (x) direction and the first row of the matrices. Since the rows are sorted according to the objective function and the number of connections is used as a heuristic for the objective values, this method produces a suboptimal result. A suboptimal procedure is sufficient for the purposes of showing the applicability and usage of the framework.

During the MTSA, each part in the subassembly is considered by its corresponding row in each Cartesian matrix. At the beginning of each iteration of the algorithm, the row corresponding to the target part is checked to see if the target part can be removed. If, in some direction, the target part's row contains no 4s and either only 2s or only 3s, or has no connections at all, the target part can be removed and is added to the new disassembly procedure in the direction opposite to its other connections. If the target part has no connections in any direction, the default removal direction is the positive x-direction. Once the target part is retrieved, the algorithm ends. Otherwise, the algorithm iterates through the possible directions and rows, starting with direction zero (the x-direction) and row zero. For the current row, the number of 4s is recorded. If the row contains no 4s and either only 2s or only 3s, the part is disassembled in the direction opposite to the direction of its connections.
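The removability check at the heart of the MTSA can be sketched as follows. Which of the digits 2 and 3 marks the positive side is not spelled out here, so the sketch adopts the convention consistent with the desktop-fan walkthrough below, where a row of only 3s is removed in the positive direction and a row of only 2s in the negative direction:

```python
def removable_direction(row, part_idx):
    """Return "+" or "-" if the part can be freed along this Cartesian
    axis, else None. Off-diagonal digits: 1 = no connection, 2/3 = a
    connection on one side, 4 = connections on both sides. Assumption:
    only 3s frees the part in the positive direction, only 2s in the
    negative direction; no connections defaults to the positive side."""
    conns = {d for j, d in enumerate(row) if j != part_idx and d != 1}
    if 4 in conns or conns == {2, 3}:
        return None              # held from both sides: not removable
    if conns == {2}:
        return "-"               # assumed: 2s block the positive side
    return "+"                   # only 3s, or no connections at all
```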

The resulting disassembly sequence from each subassembly until the subassembly containing the target part is combined to produce the final disassembly sequence. A reverse sequence of the overall disassembly sequence is appended onto the final sequence in order to guide the user in reassembling the machine after performing maintenance on the target part.
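Appending the reassembly half can be sketched as follows; the example steps are the disassembly portion of the desktop-fan procedure derived below, and the sketch reproduces the full 14-step sequence by reversing the order and flipping each direction:

```python
def full_procedure(disassembly):
    """Append the reversed disassembly sequence with flipped directions so
    the user can reassemble the machine after maintaining the target part."""
    flip = {"+": "-", "-": "+"}
    reassembly = [(part, flip[d[0]] + d[1:]) for part, d in reversed(disassembly)]
    return disassembly + reassembly

# Disassembly portion of the desktop-fan procedure from the text:
steps = [(6, "+x"), (7, "-x"), (5, "-y"), (3, "-z"), (1, "-z"), (4, "+z"), (8, "+z")]
```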

For the desktop fan example, the overall procedure is derived from the first-level and second-level subassembly matrices. The target part within the first-level subassembly matrix is part 1, back cage, while the target part within the second-level subassembly matrix is part 8, motor. The MTSA starts by checking whether the target part can be removed easily in any direction, meaning that the row corresponding to the target part contains no 4s and either only 2s or only 3s in that direction. Part 1 or back cage, in the sixth row of the first-level matrices, cannot easily be removed because its row contains 4s, so the algorithm starts with the x-direction and the first row and attempts to remove parts in order to free the target part. The first row, corresponding to part 3 or back plastic cover, cannot be removed because it contains 4s. The next row, corresponding to part 2 or front cage, also cannot be removed due to 4s. Next, part 6 or stand rivet, in row 3, can be removed in the positive x-direction because its row contains only 3s. Part 6 is added to the disassembly plan in the positive x-direction, and the adjacency information for part 6 is removed from all three Cartesian matrices. The MTSA then restarts by checking the target part for easy removal. Part 1 or back cage still cannot easily be removed in any direction, so the algorithm restarts from the first row in the x-direction. The next part that can be removed is part 7 or stand rivet1, in the negative x-direction. Part 7 is added to the disassembly plan and its adjacency information is removed from the Cartesian matrices. The algorithm then proceeds to check whether part 1 or back cage can be removed easily in any direction. Since part 1 or back cage is still not removable, the algorithm starts over from the first row in the x-direction.
The algorithm proceeds and must remove part 5 or stand in the negative y-direction and part 3 or back plastic cover in the negative z-direction before being able to easily remove part 1 or back cage in the negative z-direction. From here, the algorithm moves onto the second-level subassembly matrix with part 8 or motor being the target part. The target part, part 8 or motor, cannot easily be removed in any direction, so the algorithm begins trying to remove parts with the first row in the x-direction. Part 1 or back cage represents the second-level subassembly and has already been disassembled from the machine, so it is skipped. Part 4 or blades can be removed in the positive z-direction, so it is added to the disassembly plan and part 4’s adjacency information is removed from the second-level Cartesian matrices. The target part, part 8 or motor, can then be removed in the positive z-direction. The resulting procedure is {6 in +x, 7 in −x, 5 in −y, 3 in −z, 1 in −z, 4 in +z, 8 in +z, 8 in −z, 4 in −z, 1 in +z, 3 in +z, 5 in +y, 7 in +x, 6 in −x}. A graphical depiction of the maintenance procedure is shown in Fig. 6.

Fig. 6

### 3.4 Reconfiguration.

Reconfiguration refers to the strategy implemented to allow for the modification of maintenance procedures. The implemented reconfiguration method is performed online and is built upon the DSP method described in Sec. 3.3. The implementation in the presented framework adds an analysis of which parts are affected by the defective part: the defective part analysis (DPA). The DPA is derived from the work of Ref. [177]. An overview of the DPA is given in Fig. 7.

Fig. 7

Reconfiguration takes as input a user-defined defective part, which is the part that the operator is currently having an issue with, and outputs a list of parts that are affected by the defective part within the subassembly of the defective part. This information is used by the procedure creation algorithm to attempt to create a new procedure for the segment of the procedure that corresponds to the subassembly containing the defective part. This is done by executing the procedure creation process while keeping the adjacency information of the affected parts immovable within the subassembly of the defective part. If the procedure can be reconfigured within the subassembly, the overall procedure is updated, and the user can continue performing maintenance. If a feasible procedure cannot be found, the user is instructed to replace the subassembly that contains the defective part.

Since a feasible maintenance procedure was created from the procedure creation process for the desktop fan example, the user is able to begin performing the resulting maintenance procedure. The overall procedure found in Sec. 3.3 for the desktop fan example was {6 in +x, 7 in −x, 5 in −y, 3 in −z, 1 in −z, 4 in +z, 8 in +z, 8 in −z, 4 in −z, 1 in +z, 3 in +z, 5 in +y, 7 in +x, 6 in −x}. While the user is performing maintenance, if part 4, blades, is found to be defective, the user triggers the DPA.

#### 3.4.1 Defective Part Analysis.

The inputs to the DPA are the subassembly matrix of the defective part and the disassembly sequence corresponding to this subassembly. The subassembly adjacency matrix is saved to a local matrix, DP. For the desktop fan example, this is the second-level subassembly matrix. The current disassembly sequence for the subassembly of the defective part is saved to a vector, Φ. For the desktop fan example, this is [4, 8]. The DPA begins by initializing a vector, DV, that contains binary information about whether or not each part in the subassembly of the defective part is defective. For the desktop fan, DV is [0, 1, 0] corresponding to the parts in the second-level subassembly: 1, 4, and 8. The vectors Ψ and Ω are initialized and will contain the parts that cannot be disassembled with non-destructive methods and the set of parts that are affected by the defective part, respectively.

The DPA starts by analyzing each part in Φ. If the current part is not defective, this part is removed from Φ and the adjacency information for the part is removed from DP. If the current part is defective, this part and all of its adjacent parts are added to Ω. For the desktop fan, the algorithm starts with the first part in Φ, part 4 or blades. This part is defective, so all of the parts it is adjacent to are added to Ω so that Ω = [4, 8]. After analyzing each part, the parts in Ω are added to Ψ to produce Ψ = [4, 8] and removed from Φ, so Φ = [].

After the first analysis, a second check is conducted. For each part in Ψ, if the part is not defective and does not have any adjacency relationships in DP with any other parts, the part is removed from Ψ and DP. Otherwise, the part is kept in Ψ. For the desktop fan, the first part in Ψ, 4 or blades, is defective and thus stays in Ψ. The next part, part 8 or motor, is not defective but does have adjacency with other parts. Thus, part 8 is kept in Ψ. The final Ψ vector is given as [4, 8]. Since the final Ψ vector contains the target part, part 8 or motor, the subassembly containing this target part is replaced. The reconfigured procedure is given as {Replace back cage}. The user then continues performing maintenance by replacing this subassembly. A depiction of the reconfigured maintenance procedure is given in Fig. 8.
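The two DPA passes can be sketched as follows. This is one reading of the steps described above: the adjacency map (DP) is taken over the parts still present in the subassembly sequence, parts pulled into Ω keep their adjacency information, and other non-defective parts are pruned from DP before the second check. Names and the dict/set layout are illustrative.

```python
def defective_part_analysis(sequence, adjacency, defective):
    """Sketch of the DPA for one subassembly.

    sequence:  current disassembly sequence for the subassembly (Phi)
    adjacency: dict part -> set of adjacent parts within the subassembly (DP)
    defective: set of defective parts (parts flagged 1 in DV)
    Returns Psi, the parts that cannot be removed non-destructively."""
    adjacency = {p: set(n) for p, n in adjacency.items()}  # local copy of DP
    omega = set()
    for part in sequence:                    # first pass over Phi
        if part in defective:
            omega |= {part} | adjacency.get(part, set())
    for part in sequence:                    # unaffected parts leave DP
        if part not in omega and part not in defective:
            adjacency.pop(part, None)
    psi = set()
    for part in omega:                       # second pass: prune Psi
        if part in defective or adjacency.get(part, set()) - {part}:
            psi.add(part)                    # defective, or still attached
    return psi
```

For the desktop fan, with Φ = [4, 8], part 4 defective, and parts 4 and 8 adjacent, this sketch returns Ψ = {4, 8}, matching the walkthrough above.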

Fig. 8

## 4 Augmented Reality Application

### 4.1 Overview.

In order to demonstrate the efficacy of the proposed framework on a more complex machine, a MakerBot Replicator 2X 3D printer was chosen as an application problem (Fig. 9). One part that requires regular maintenance in a 3D printer is the leadscrew that provides upward and downward motion for the build plate. The framework was tested within an Android mobile application on Google Pixel 3 and Samsung Galaxy S6 smartphones.

Fig. 9

When the operator first opens the app, the user must align the Guide View for the MakerBot (Fig. 10) with the real-life object in order to load the procedure reconfiguration interface of the application and the MakerBot 3D model AR augmentation. Once the interface of the application is loaded (Fig. 11(a)), the user is able to either create a new maintenance procedure, choose to perform a predefined procedure, or modify a predefined procedure manually. In order to begin the maintenance of the leadscrew, the operator chooses to create a new maintenance procedure. The user is then brought to a screen where they can choose the target part for the procedure (Fig. 11(b)). In this case, part 15 or Lead Screw is chosen.

The maintenance procedure is automatically generated using the algorithms described in the preceding sections. The final procedure produced for the leadscrew is {20 in +x, 21 in −x, 2 in −x, 3 in −x, 4 in +x, 5 in +x, 6 in +x, 14 in −y, 17 in +y, 15 in +y, 15 in −y, 17 in −y, 14 in +y, 6 in −x, 5 in −x, 4 in −x, 3 in +x, 2 in +x, 21 in +x, 20 in −x}. For each step in the procedure, a Guide View similar to that shown in Fig. 10 corresponding to the relevant part or subassembly is displayed to the user. Once the user aligns the camera view with the Guide View, 3D model augmentations for the part or subassembly are loaded (Fig. 11(c)) in order to aid the user in identifying the relevant part or subassembly for that step of the procedure.

While the user is performing maintenance, they encounter an issue with part 20 or side panel and cannot continue performing the procedure. They then select the “Reconfigure” button to start the reconfiguration process (Fig. 11(c)). In this case, part 20 or side panel is considered the defective part for the DPA. The resultant affected parts (Ψ) from the DPA include parts 20, 4, and 5. The procedure creation process (MTSA) is then rerun with parts 20, 4, and 5 marked as immovable.
The new procedure is given as {21 in −x, 2 in −x, 3 in −x, 6 in −x, 14 in −y, 17 in +y, 15 in +y, 15 in −y, 17 in −y, 14 in +y, 6 in +x, 3 in +x, 2 in +x, 21 in +x}. The final maintenance procedure, which includes the original procedure and reconfigured procedure due to the defective part, is given in Fig. 12. The in-field computation times for the MTSA, procedure creation process, DPA, and reconfiguration process are given in Table 3.

Fig. 10
Fig. 11
Fig. 12
Table 3

In-field times for MTSA, procedure creation, DPA, and reconfiguration on the Google Pixel 3 and Samsung Galaxy S6 smartphones

| Phone | Target part | Original disassembly sequence | MTSA run time [s] | Procedure creation time [s] | Defective part | Reconfigured disassembly sequence | DPA run time [s] | Reconfiguration time [s] |
|---|---|---|---|---|---|---|---|---|
| Google Pixel 3 | 15, Lead Screw | {20 = +x, 21 = −x, 2 = −x, 3 = −x, 4 = +x, 5 = +x, 6 = +x, 14 = −y, 17 = +y, 15 = +y} | ≤1 | 29 | 20, side panel | {21 = −x, 2 = −x, 3 = −x, 6 = −x, 14 = −y, 17 = +y, 15 = +y} | ≤1 | 76 |
| Samsung Galaxy S6 | | | ≤1 | 52 | | | ≤1 | 166 |

### 4.2 Implementation Details.

A visual depiction of the application software architecture is provided in Fig. 13. The parts list, part bounding box dimensions, adjacency matrix, and contact matrix were computed using a macro in FreeCAD [178] and saved to a text file that is input to the subassembly detection module. The developed script automatically moved each part in the CAD model with respect to every other part in the positive and negative Cartesian directions. For the adjacency matrix, each part was moved 10 mm in each direction; if there was a collision between the parts during or after this movement, the parts were marked as adjacent. For the contact matrix, the parts were moved 1 mm in each direction to test the fit between the parts. The subassembly detection method was implemented in MATLAB [179], taking as input the text file produced by the adjacency analysis and outputting a text file containing the multi-level subassembly matrices and the base parts structure. Both the FreeCAD macro and the MATLAB code are run offline, and their outputs are used by the online procedure creation steps.
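The move-and-check test above can be sketched as follows. The actual macro moves the full part geometry inside FreeCAD; this simplified sketch uses axis-aligned bounding boxes and sweeps the moved box along the motion so that a collision "during or after" the movement is detected:

```python
def boxes_collide(a, b):
    """Axis-aligned bounding-box overlap test; a and b are (min, max)
    corner pairs, each a 3-tuple of coordinates."""
    return all(a[0][k] <= b[1][k] and b[0][k] <= a[1][k] for k in range(3))

def swept_box(box, axis, delta):
    """Extend a box along one axis to cover the whole translation."""
    lo, hi = list(box[0]), list(box[1])
    if delta >= 0:
        hi[axis] += delta
    else:
        lo[axis] += delta
    return (lo, hi)

def adjacent(box_i, box_j, axis, step=10.0):
    """Move box_i by +/-step along an axis (10 mm for the adjacency test,
    1 mm for the contact test) and report [hit after +step, hit after -step]."""
    return [boxes_collide(swept_box(box_i, axis, s * step), box_j)
            for s in (+1.0, -1.0)]
```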

Fig. 13

The MTSA is implemented in Android Studio [180] online while the operator is using the application. When the user selects a target part to create a new maintenance procedure, as shown in Fig. 11(b), the algorithm searches for the target part in the base parts structure and runs the MTSA algorithm for all subassemblies up to and including that of the target part.

The reconfiguring process is also implemented in Android Studio [180] and is triggered by the press of a button by the user (Fig. 11(c)). The current part in the procedure is then deemed the defective part and the DPA is run.

Supplementary task information for each part is organized in an ontology that is stored in local storage on the AR device and utilized by the AR application. The ontology was developed offline using Protégé [181] and loaded into Android Studio [180] in the OWL format. The Jena reasoner [182] is used to validate the ontology and to infer relations between the instances. Task information is matched with each part or subassembly in the procedure as the current step is loaded within the application.

The AR portion of the application was created using the Vuforia Software Development Kit (SDK) [183] in Android Studio [180]. For the whole object as well as the individual parts and subassemblies within the machine, Model Targets were created based on Wavefront .obj files using Vuforia's Model Target Generator [183]. The Guide Views for these Model Targets are presented to the user during both the initial activity of the application and each step of the maintenance procedure in order to register the 3D augmentations. Once the Guide View is properly aligned with the real object, CAD model augmentations are registered with respect to the real object and tracking of the real object begins. CAD model augmentations were created based on Wavefront .obj and .mtl files. A C++ parser, Tiny OBJ Loader [184], was used to load the .obj and .mtl files in Android Studio [180]. OpenGL for Embedded Systems (ES) [185] was used to display the .obj augmentations on the screens of the devices.

## 5 Discussion and Conclusions

The framework outlined in this paper allows for the reconfiguration of procedures to be included in any AR-guided maintenance application regardless of the type of AR used, the application for the system, and the environment in which the system is being used. While researchers have implemented various reconfiguration strategies to permit users to continue performing maintenance when they encounter issues, these solutions only work locally for each implemented project and are limited to user knowledge and experience or expert availability. The presented framework has the ability to eliminate the need for expert availability during maintenance. This drastically reduces the time the user spends waiting for a solution and thus reduces maintenance costs overall.

The implementation of the presented framework has limitations. First, the adjacency analysis of the machine is restricted to six directions. In order to make the adjacency analysis, and thus the resulting procedures, more accurate, additional directions of motion should be considered, both linear and rotational. Second, the automatically determined subassemblies may not align with the subassemblies of the real machine. While the autonomous nature of the subassembly detection method is convenient when the replacement of a subassembly is needed, replacing the detected subassembly may not be reasonable in practice. However, the overall subassembly information for a machine is generally available a priori and can be provided directly as manual input to the framework, thus avoiding automated subassembly detection. Third, a major limitation of the described implementation is that if a feasible reconfigured procedure cannot be found, the entire subassembly containing the defective part is replaced.

The application of the framework requires improvements to increase computational speed and efficiency. As seen in Table 3, the computational times for both the MTSA in procedure creation and the DPA in reconfiguration are low (≤1 s). The original TSA in Ref. [116] uses an exhaustive search strategy to identify all possible disassembly sequences for obtaining the target part. While the MTSA reduces the computational time of the exhaustive search by stopping after the first feasible disassembly sequence, it produces a suboptimal disassembly sequence. In order to produce an optimized disassembly sequence for procedure creation, an efficient search strategy should be used to obtain a disassembly plan, such as genetic algorithms, swarm optimization algorithms, or graph searching algorithms as described in Sec. 2.4.

In contrast, the actual wait times to obtain the original procedure in procedure creation and the modified procedure in reconfiguration are very high compared to the computation times for the MTSA and DPA, respectively. The computational bottleneck within the application has been identified as the ontology. An ontology is an artificial intelligence (AI)-based dynamic data storage structure that has advantages over conventional databases because ontological reasoners can validate the ontology and infer knowledge between instances; databases, on the other hand, simply store records, such as key-value pairs, without inference capabilities. For additional information about the use of ontologies and other AI techniques within AR applications, the reader is referred to Ref. [186]. The specific issues encountered during implementation include: (1) the creation of ontology instances, (2) the overwriting of new information to ontology instances, (3) the validation of the ontology instances with the Jena reasoner, and (4) the retrieval of the properties of the instances from within the ontology. Based on the high wait times when utilizing and modifying this information, alternative ontology formats and reasoners should be investigated for this application, or simpler storage solutions such as extensible markup language (XML) files or relational database management systems (RDBMS) should be considered.

There are many AR SDKs available such as ARToolKit [187], Wikitude [188], Kudan [189], ARCore [190], ARKit [191], EasyAR [192], and DroidAR [193]. These SDKs include various AR tracking strategies including marker-based, feature-based, and model-based. These AR tracking strategies require varying inputs for training before use or reference while in use. Because CAD models are required as input to the presented framework, an SDK with model-based tracking, such as Vuforia [183], was utilized in the application of the framework. Other tracking strategies may offer increased speed and efficiency for the application of the presented framework. Future implementations of this framework should explore the use of other AR SDKs and tracking strategies for increased computational speed and efficiency.

The AR SDK used in this application, Vuforia [183], can be used within Unity [194], Android Studio [180], Xcode [195], and Visual Studio [196]. For the presented application, AR capabilities through Vuforia [183] were integrated into a prototype maintenance application developed in Android Studio [180]. Game engine platforms, such as Unity [194], are commonly used to develop AR applications because they are optimized to handle 3D environments and complex user interactions, which are prevalent in AR systems. The utilization of game engines to increase the speed and efficiency of the application of this framework should be investigated in future research.

The presented AR application not only helps the user by producing an updated procedure on demand but also allows the user to make their own modifications to procedures. Additionally, users are able to add notes and safety warnings to procedures for additional assistance the next time that procedure is performed. To improve the usability of the developed application, a full test with more complex machinery and real maintenance workers in a realistic working environment should be conducted. Such tests would provide insight into the real-time usability of handheld AR within the maintenance environment.

Although there are improvements to be made to the presented application, including reducing the wait time for procedures and testing the application in more realistic and complex maintenance environments, the developed framework and its underlying ideology remain invariant to these improvements. The framework is an effective tool that provides additional aid and autonomy to the user by performing automated AR-based reconfiguration of maintenance procedures. When implemented, the proposed framework enables users of any experience level to perform maintenance effectively.

## Acknowledgment

The research outlined in this paper is based upon work supported by the Naval Surface Warfare Center (NSWC) under NEEC award No. N00174-19-0025.

## Conflict of Interest

There are no conflicts of interest.

## Data Availability Statement

The authors attest that all data for this study are included in the paper.

## References

1. Ott, J., 1995, "Maintenance Executives Seek Greater Efficiency," Aviation Week Space Technol., 142(20), pp. 43–44.
2. Henderson, S., and Feiner, S., 2010, "Exploring the Benefits of Augmented Reality Documentation for Maintenance and Repair," IEEE Trans. Vis. Comput. Graph., 17(10), pp. 1355–1368.
3. Fiorentino, M., Uva, A. E., Gattullo, M., Debernardis, S., and Monno, G., 2014, "Augmented Reality on Large Screen for Interactive Maintenance Instructions," Comput. Indus., 65(2), pp. 270–278.
4. Garza, L. E., Pantoja, G., Ramírez, P., Ramírez, H., Rodríguez, N., González, E., Quintal, R., and Pérez, J. A., 2013, "Augmented Reality Application for the Maintenance of a Flapper Valve of a Fuller-Kynion Type M Pump," Proc. Comput. Sci., 25, pp. 154–160.
5. Zhu, J., Ong, S. K., and Nee, A., 2013, "An Authorable Context-Aware Augmented Reality System to Assist the Maintenance Technicians," 66(9–12), pp. 1699–1714.
6. Dini, G., and Dalle Mura, M., 2015, "Application of Augmented Reality Techniques in Through-Life Engineering Services," Proc. CIRP, 38, pp. 14–23.
7. Palmirini, R., Erkoyuncu, J. A., and Roy, R., 2017, "An Innovative Process to Select Augmented Reality (AR) Technology for Maintenance," Proc. CIRP, 59, pp. 23–28.
8. Martinetti, A., M., and van Dongen, L., 2017, "Shaping the Future Maintenance Operations: Reflections on the Adoptions of Augmented Reality Through Problems and Opportunities," Proc. CIRP, 59(1), pp. 14–17.
9. Lamberti, F., Manuri, F., Paravati, G., Piumatti, G., and Sanna, A., 2016, "Using Semantics to Automatically Generate Speech Interfaces for Wearable Virtual and Augmented Reality Applications," IEEE Trans. Human-Mach. Syst., 47(1), pp. 152–164.
10. Azuma, R., Baillot, Y., Behringer, R., Feiner, S., Julier, S., and MacIntyre, B., 2001, IEEE Comput. Graphics Appl., 21(6), pp. 34–47.
11. Sutherland, I. E., 1965, "The Ultimate Display," IFIPS Congress, New York City, 65, No. 2, pp. 506–508.
12. Lyu, M. R., King, I., Wong, T. T., Yau, E., and Chan, P. W., 2005, "ARCADE: Augmented Reality Computing Arena for Digital Entertainment," 2005 IEEE Aerospace Conference, pp. 1–9.
13. Von Itzstein, G. S., Billinghurst, M., Smith, R. T., and Thomas, B. H., 2017, Augmented Reality Entertainment: Taking Gaming Out of the Box, Springer International Publishing, Cham, pp. 1–9.
14. Berman, B., and Pollack, D., 2021, Strategies for the Successful Implementation of Augmented Reality, Business Horizons.
15. Feng, Y., and Mueller, B., 2019, "The State of Augmented Reality Advertising Around The Globe: A Multi-Cultural Content Analysis," J. Promot. Manage., 25(4), pp. 453–475.
16. Hopp, T., and H., 2016, "Novelty Effects in Augmented Reality Advertising Environments: The Influence of Exposure Time and Self-Efficacy," 37(2), pp. 113–130.
17. Singh, P., and Pandey, M., 2014, "Augmented Reality Advertising: An Impactful Platform for New Age Consumer Engagement," 16(2), pp. 24–28.
18. Geiger, P., Schickler, M., Pryss, R., Schobel, J., and Reichert, M., 2014, "Location-Based Mobile Augmented Reality Applications: Challenges, Examples, Lessons Learned," 10th International Conference on Web Information Systems and Technologies (WEBIST 2014), Special Session on Business Apps, pp. 383–394.
19. Kourouthanassis, P. E., Boletsis, C., and Lekakos, G., 2015, "Demystifying the Design of Mobile Augmented Reality Applications," Multi. Tools Appl., 74(3), pp. 1045–1066.
20. Bursali, H., and Yilmaz, R. M., 2019, "Effect of Augmented Reality Applications on Secondary School Students’ Reading Comprehension and Learning Permanency," Comput. Human Behav., 95, pp. 126–135.
21. Majid, N. A. A., Mohammed, H., and Sulaiman, R., 2015, "Students’ Perception of Mobile Augmented Reality Applications in Learning Computer Organization," Proc. Soc. Behav. Sci., 176, pp. 111–116.
22. Tekedere, H., and Göker, H., 2016, "Examining the Effectiveness of Augmented Reality Applications in Education: A Meta-Analysis," Int. J. Environ. Sci. Edu., 11(16), pp. 9469–9481.
23. Tzima, S., Styliaras, G., and Bassounas, A., 2019, "Augmented Reality Applications in Education: Teachers Point of View," Educ. Sci., 9(2), p. 99.
24. Özdemir, M., and Demir, M., 2018, "The Effect of Augmented Reality Applications in the Learning Process: A Meta-Analysis Study," Eurasian J. Educ. Res. (EJER), 74, pp. 165–186.
25. Lang, S., Dastagir Kota, M. S. S., Weigert, D., and Behrendt, F., 2019, "Mixed Reality in Production and Logistics: Discussing the Application Potentials of Microsoft HoloLens™," Proc. Computer Sci., 149, pp. 118–129.
26. Ong, S., Yuan, M., and Nee, A., 2008, "Augmented Reality Applications in Manufacturing: A Survey," Int. J. Prod. Res., 46(10), pp. 2707–2742.
27. Dodevska, Z. A., and Mihić, M. M., 2018, "Augmented Reality and Virtual Reality Technologies in Project Management: What Can We Expect?," Eur. Project Manage. J., 8(1), pp. 17–24.
28. Lin, T. H., Liu, C. H., Tsai, M. H., and Kang, S. C., 2015, "Using Augmented Reality in a Multiscreen Environment for Construction Discussion," J. Comput. Civil Eng., 29(6), p. 04014088.
29. Rankohi, S., and Waugh, L., 2013, "Review and Analysis of Augmented Reality Literature for Construction Industry," Vis. Eng., 1(1), p. 9.
30. Furata, H., Takahashi, K., Nakatsu, K., Ishibashi, K., and Aira, M., 2012, "A Mobile Application System for Sightseeing Guidance Using Augmented Reality," The 6th International Conference on Soft Computing and Intelligent Systems and The 13th International Symposium on Advanced Intelligence Systems, pp. 1903–1906.
31. Sasaki, R., and Yamamoto, K., 2019, "A Sightseeing Support System Using Augmented Reality and Pictograms Within Urban Tourist Areas in Japan," ISPRS Int. J. Geo-Inform., 8(9), p. 381.
32. Song, M., Mokhov, S. A., Mudur, S. P., and Bustros, J., 2015, "Demo: Towards Historical Sightseeing With An Augmented Reality Interactive Documentary App," 2015 IEEE Games Entertainment Media Conference (GEM), pp. 1–2.
33. R., Ma, M., and Temple, N., 2016, "Augmented Reality and Gamification in Heritage Museums," Serious Games, Ma, M., Oliveira, M. F., Baalsrud Hauge, J., and Göbel, S., eds., Springer International Publishing, Cham, pp. 181–187.
34. Serravalle, F., Ferraris, A., Vrontis, D., Thrassou, A., and Christofi, M., 2019, "Augmented Reality in the Tourism Industry: A Multi-Stakeholder Analysis of Museums," Tourism Manage. Perspect., 32, p. 100549.
35. Yoon, S. A., and Wang, J., 2014, "Making the Invisible Visible in Science Museums Through Augmented Reality Devices," TechTrends, 58(1), pp. 49–55.
36. Chen, L., Day, T. W., Tang, W., and John, N. W., 2017, "Recent Developments and Future Challenges in Medical Mixed Reality," 2017 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), pp. 123–135.
37. Rymer, M. T., Damiano, E. S., McCombs, B., and De La Torre, R., 2018, "Using Augmented Reality and Mobile Technologies to Train Automotive Technicians," 2018 IEEE International Conference on Teaching, Assessment, and Learning for Engineering (TALE), pp. 1074–1078.
38. Westerfield, G., Mitrovic, A., and Billinghurst, M., 2015, "Intelligent Augmented Reality Training for Motherboard Assembly," Int. J. Artificial Intell. Educ., 25(1), pp. 157–172.
39. Chen, Y. J., Lai, Y. S., and Lin, Y. H., 2020, "BIM-Based Augmented Reality Inspection and Maintenance of Fire Safety Equipment," Auto. Const., 110, p. 103041.
40. Shin, D. H., and Dunston, P. S., 2009, "Evaluation of Augmented Reality in Steel Column Inspection," Auto. Const., 18(2), pp. 118–129.
41. Webster, A., Feiner, S., MacIntyre, B., Massie, W., and Krueger, T., 1996, "Augmented Reality in Architectural Construction, Inspection, and Renovation," Comput. Civil Eng., 1, pp. 913–919.
42. Zhou, Y., Luo, H., and Yang, Y., 2017, "Implementation of Augmented Reality for Segment Displacement Inspection During Tunneling Construction," Auto. Const., 82, pp. 112–121.
43. Hock, P., Benedikter, S., Gugenheimer, J., and Rukzio, E., 2017, "CarVR: Enabling In-Car Virtual Reality Entertainment," pp. 4034–4044.
44. Kodama, R., Koge, M., Taguchi, S., and Kajimoto, H., 2017, "COMS-VR: Mobile Virtual Reality Entertainment System Using Electric Car and Head-mounted Display," 2017 IEEE Symposium on 3D User Interfaces (3DUI), pp. 130–133.
45. Guttentag, D. A., 2010, "Virtual Reality: Applications and Implications for Tourism," Tourism Manage., 31(5), pp. 637–651.
46. Jung, T., and Moorhouse, N., 2017, "Tourists’ Experience of Virtual Reality Applications," 2017 IEEE International Conference on Consumer Electronics (ICCE), pp. 208–210.
47. Vitali, A., and Rizzi, C., 2018, "Acquisition of Customer’s Tailor Measurements for 3D Clothing Design Using Virtual Reality Devices," Virtual Phys. Protot., 13(3), pp. 131–145.
48. Juraschek, M., Büth, L., Posselt, G., and Herrmann, C., 2018, "Mixed Reality in Learning Factories," Proc. Manufact., 23, pp. 153–158. Advanced Engineering Education & Training for Manufacturing Innovation, 8th CIRP Sponsored Conference on Learning Factories (CLF 2018).
49. Moore, H. F., and Gheisari, M., 2019, "A Review of Virtual and Mixed Reality Applications in Construction Safety Literature," Safety, 5(3), p. 51.
50. Smith, J. W., and Salmon, J. L., 2017, "Development and Analysis of Virtual Reality Technician-Training Platform and Methods," I/ITSEC 2017: Interservice/Industry Training, Simulation, and Education Conference, pp. 1–12.
51. Kamińska, D., Sapiński, T., Wiak, S., Tikk, T., Haamer, R. E., Avots, E., Helmi, A., Ozcinar, C., and Anbarjafari, G., 2019, "Virtual Reality and Its Applications in Education: Survey," Information, 10(10), p. 318.
52. J., Majchrzak, T. A., Fromm, J., and Wohlgenannt, I., 2020, "A Systematic Review of Immersive Virtual Reality Applications for Higher Education: Design Elements, Lessons Learned, and Research Agenda," Comput. Educ., 147, p. 103778.
53. Bouchlaghem, N., and Liyanage, I., 1996, "Virtual Reality Applications in the UK’s Construction Industry," Cib Rep., pp. 89–94.
54. Kizil, M., 2003, "Virtual Reality Applications in the Australian Minerals Industry," APCOM 2003: 31st International Symposium on Application of Computers and Operations Research in the Mineral Industries, pp. 569–574.
55. Zhang, Y., Liu, H., Kang, S. C., and Al-Hussein, M., 2020, "Virtual Reality Applications for the Built Environment: Research Trends and Opportunities," Auto. Const., 118, p. 103311.
56. Choi, S., Jung, K., and Noh, S. D., 2015, "Virtual Reality Applications in Manufacturing Industries: Past Research, Present Findings, and Future Directions," Concurrent Eng., 23(1), pp. 40–63.
57. Mujber, T., Szecsi, T., and Hashmi, M., 2004, "Virtual Reality Applications in Manufacturing Process Simulation," J. Mater. Process. Technol., 155–156, pp. 1834–1838. Proceedings of the International Conference on Advances in Materials and Processing Technologies: Part 2.
58. Ahmed, S., 2018, "A Review on Using Opportunities of Augmented Reality and Virtual Reality in Construction Project Management," Organ., Tech. Manage. Const.: Int. J., 10, pp. 1839–1852.
59. Müller, M., Günther, T., Kammer, D., Wojdziak, J., Lorenz, S., and Groh, R., 2016, "Smart Prototyping - Improving the Evaluation of Design Concepts Using Virtual Reality," VAMR 2016: International Conference on Virtual, Augmented and Mixed Reality, Vol. 9740, pp. 47–58.
60. J., Wiederhold, B. K., and Riva, G., 2016, "Future Directions: How Virtual Reality Can Further Improve the Assessment and Treatment of Eating Disorders and Obesity," Cyberpsychol., Behav., Soc. Netw., 19(2), pp. 148–153.
61. Moline, J., 1997, "Virtual Reality for Health Care: a Survey," Studies Health Tech. Inform., 44, pp. 3–34.
62. Pallavicini, F., Argenton, L., Toniazzi, N., Aceti, L., and Mantovani, F., 2016, "Virtual Reality Applications for Stress Management Training in the Military," Aeros. Med. Human Perform., 87, pp. 1021–1030.
63. Riva, G., Mantovani, F., and Gaggioli, A., 2004, "Presence and Rehabilitation: Toward Second-generation Virtual Reality Applications in Neuropsychology," J. NeuroEng. Rehabil., 1(1), p. 9.
64. Rizzo, A. A., Schultheis, M., Kerns, K. A., and Mateer, C., 2004, "Analysis of Assets for Virtual Reality Applications in Neuropsychology," Neuropsychol. Rehabil., 14(1–2), pp. 207–239.
65. Stanica, I. C., Dascalu, M. I., Moldoveanu, A., Bodea, C. N., and Hostiuc, S., 2016, "A Survey of Virtual Reality Applications As Psychotherapeutic Tools to Treat Phobias," eLSE 2016: The 12th International Scientific Conference eLearning and Software for Education, Vol. 1, pp. 392–399.
66. Jin, X., Xu, J., Wang, C. C. L., Huang, S., and Zhang, J., 2008, "Interactive Control of Large-Crowd Navigation in Virtual Environments Using Vector Fields," IEEE Comput. Graphics Appl., 28(6), pp. 37–46.
67. Rasti, D., and Rasti, A., 2018, "Augmented Reality Framework and Demonstrator," Master’s Thesis, Tampere University, Helsinki.
68. Carmigniani, J., Furht, B., Anisetti, M., Ceravolo, P., Damiani, E., and Ivkovic, M., 2011, "Augmented Reality Technologies, Systems and Applications," Multi. Tools Appl., 51(1), pp. 341–377.
69. N., and Awang Rambli, D., 2012, "A Survey of Mobile Augmented Reality Applications," 2012 1st International Conference on Future Trends in Computing and Communication Technologies, pp. 89–96.
70. Mazuryk, T., and Gervautz, M., 1996, Virtual Reality History, Applications, Technology and Future, Technical Report TR-186-2-96-06, Institute of Computer Graphics and Algorithms, Vienna University of Technology, Favoritenstrasse 9-11/E193-02, A-1040 Vienna, Austria.
71. Costanza, E., Kunz, A., and Fjeld, M., 2009, Human Machine Interaction (Lecture Notes in Computer Science, Vol. 5440), Springer, Berlin, Heidelberg, pp. 47–68.
72. Oliveira, R., Farinha, T., Raposo, H., and Pires, N., 2014, "Augmented Reality and the Future of Maintenance," Proceedings of Maintenance Performance Measurement and Management (MPMM).
73. Schmalstieg, D., Fuhrmann, A., Hesina, G., Szalavári, Z., Encarnaçao, L. M., Gervautz, M., and Purgathofer, W., 2002, "The Studierstube Augmented Reality Project," Presence: Teleoperators Virtual Environ., 11(1), pp. 33–54.
74. Neumann, U., and Majoros, A., 1998, "Cognitive, Performance, and Systems Issues for Augmented Reality Applications in Manufacturing and Maintenance," Proceedings of the IEEE 1998 Virtual Reality Annual International Symposium (Cat. No. 98CB36180), Atlanta, GA, IEEE, pp. 4–11.
75. Haringer, M., and Regenbrecht, H. T., 2002, "A Pragmatic Approach to Augmented Reality Authoring," Proceedings of the International Symposium on Mixed and Augmented Reality, IEEE, pp. 237–245.
76. Microsoft, PowerPoint, https://www.microsoft.com/en-us/microsoft-365/powerpoint (1987–2020).
77. Zhu, J., Ong, S. K., and Nee, A. Y., 2015, "A Context-Aware Augmented Reality Assisted Maintenance System," Int. J. Comput. Int. Manufact., 28(2), pp. 213–225.
78. De Crescenzio, F., Fantini, M., Persiani, F., Di Stefano, L., Azzari, P., and Salti, S., 2010, "Augmented Reality for Aircraft Maintenance Training and Operations Support," IEEE Comput. Graphics Appl., 31(1), pp. 96–101.
79. Zauner, J., Haller, M., Brandl, A., and Hartman, W., 2003, "Authoring of a Mixed Reality Assembly Instructor for Hierarchical Structures," The Second IEEE and ACM International Symposium on Mixed and Augmented Reality, 2003, Tokyo, Japan, IEEE, pp. 237–246.
80. Bhattacharya, B., and Winer, E. H., 2019, "Augmented Reality Via Expert Demonstration Authoring (AREDA)," Comput. Indus., 105, pp. 61–79.
81. Mura, K., Petersen, N., Huff, M., and Ghose, T., 2013, "IBES: A Tool for Creating Instructions Based on Event Segmentation," Front. Psychol., 4, p. 994.
82. Petersen, N., Pagani, A., and Stricker, D., 2013, "Real-Time Modeling and Tracking Manual Workflows From First-Person Vision," 2013 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), pp. 117–124.
83. Petersen, N., and Stricker, D., 2012, "Learning Task Structure From Video Examples for Workflow Tracking and Authoring," 2012 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Atlanta, GA, IEEE, pp. 237–246.
84. Mourtzis, D., Zogopoulos, V., and Vlachou, E., 2017, "Augmented Reality Application to Support Remote Maintenance As a Service in the Robotics Industry," Proc. CIRP, 63, pp. 46–51.
85. Rentzos, L., Papanastasiou, S., Papakostas, N., and Chryssolouris, G., 2013, "Augmented Reality for Human-Based Assembly: Using Product and Process Semantics," IFAC Proc. Vol., 46(15), pp. 98–101.
86. Salonen, T., Saääki, J., Woodward, C., Korkalo, O., Marstio, I., and Rainio, K., 2009, "Data Pipeline From CAD to AR Based Assembly Instructions," ASME World Conference on Innovative Virtual Reality, Vol. 43376, pp. 165–168.
87. Wang, Z., Shen, Y., Ong, S. K., and Nee, A. Y. C., 2009, "Assembly Design and Evaluation Based on Bare-Hand Interaction in An Augmented Reality Environment," 2009 International Conference on CyberWorlds, IEEE, pp. 21–28.
88. Sääski, J., Salonen, T., Hakkarainen, M., Siltanen, S., Woodward, C., and Lempiäinen, J., 2008, "Integration of Design and Assembly Using Augmented Reality," Micro-Assembly Technologies and Applications (IPAS 2008, IFIP – International Federation for Information Processing, Vol. 260), S. Ratchev and S. Koelemeijer, eds., Springer, Boston, MA.
89. Mohr, P., Kerbl, B., Donoser, M., Schmalstieg, D., and Kalkofen, D., 2015, "Retargeting Technical Documentation to Augmented Reality," Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, Seoul, South Korea, pp. 3337–3346.
90. Serván, J., Mas, F., Menéndez, J., and Ríos, J., 2012, "Assembly Work Instruction Deployment Using Augmented Reality," Key Engineering Materials, Vol. 502, Trans Tech Publ, pp. 25–30.
91. Serván, J., Mas, F., Menéndez, J., and Ríos, J., 2012, "Using Augmented Reality in AIRBUS A400M Shop Floor Assembly Work Instructions," AIP Conference Proceedings, Vol. 1431, American Institute of Physics, pp. 633–640.
92. Makri, A., Weidenhausen, J., Eschler, P., Stricker, D., Machui, O., Fernandes, C., Maria, S., Voss, G., and Ioannidis, N., 2005, "ULTRA Light Augmented Reality Mobile System," Proceedings of the ISMAR, 2005.
93. Vorraber, W., Gasser, J., Webb, H., Neubacher, D., and Url, P., 2020, "Assessing Augmented Reality in Production: Remote-Assisted Maintenance With HoloLens," Proc. CIRP, 88, pp. 139–144.
94. Obermair, F., Althaler, J., Seiler, U., Zeilinger, P., Lechner, A., Pfaffeneder, L., Richter, M., and Wolfartsberger, J., 2020, "Maintenance With Augmented Reality Remote Support in Comparison to Paper-Based Instructions: Experiment and Analysis," IEEE 7th International Conference on Industrial Engineering and Applications (ICIEA), Bangkok, Thailand, IEEE, pp. 942–947.
95. Wolfartsberger, J., Zenisek, J., and Wild, N., 2020, "Data-Driven Maintenance: Combining Predictive Maintenance and Mixed Reality-Supported Remote Assistance," Proc. Manufact., 45, pp. 307–312.
96. Webel, S., Bockholt, U., Engelke, T., Gavish, N., Olbrich, M., and Preusche, C., 2013, "An Augmented Reality Training Platform for Assembly and Maintenance Skills," Rob. Auton. Syst., 61(4), pp. 398–403.
97. Mourtzis, D., Siatras, V., and Angelopoulos, J., 2020, "Real-Time Remote Maintenance Support Based on Augmented Reality (AR)," Appl. Sci., 10(5), p. 1855.
98. Ong, S., and Zhu, J., 2013, "A Novel Maintenance System for Equipment Serviceability Improvement," CIRP Ann., 62(1), pp. 39–42.
99. Mourtzis, D., Xanthi, F., and Zogopoulos, V., 2019, "An Adaptive Framework for Augmented Reality Instructions Considering Workforce Skill," Proc. CIRP, 81, pp. 363–368.
100. Neges, M., Wolf, M., and Abramovici, M., 2015, "Secure Access Augmented Reality Solution for Mobile Maintenance Support Utilizing Condition-Oriented Work Instructions," Proc. CIRP, 38, pp. 58–62.
101. Yuan, M., Ong, S., and Nee, A. Y., 2005, Assembly Guidance in Augmented Reality Environments Using a Virtual Interactive Tool, Innovation in Manufacturing Systems and Technology (IMST).
102. Chang, M., Nee, A., and Ong, S., 2020, "Interactive AR-Assisted Product Disassembly Sequence Planning (ARDIS)," Int. J. Prod. Res., 58, pp. 1–16.
103. Frizziero, L., and Liverani, A., 2020, "Disassembly Sequence Planning (DSP) Applied to a Gear Box: Comparison Between Two Literature Studies," Appl. Sci., 10(13), p. 4591.
104. Frizziero, L., Liverani, A., Caligiana, G., Donnici, G., and Chinaglia, L., 2019, "Design for Disassembly (DfD) and Augmented Reality (AR): Case Study Applied to a Gearbox," Machines, 7(2), p. 29.
105. Woo, T. C., and Dutta, D., 1991, "Automatic Disassembly and Total Ordering in Three Dimensions," ASME J. Eng. Ind., 113(2), pp. 207–213.
106. Dutta, D., and Woo, T. C., 1995, "Algorithm for Multiple Disassembly and Parallel Assemblies," ASME J. Eng. Ind., 117(1), pp. 102–109.
107. Lambert, A. J., 2002, "Determining Optimum Disassembly Sequences in Electronic Equipment," Comput. Indus. Eng., 43(3), pp. 553–575.
108. Ong, N., and Wong, Y., 1999, "Automatic Subassembly Detection From a Product Model for Disassembly Sequence Generation," 15(6), pp. 425–431.
109. García, M. A., Larré, A., López, B., and Oller, A., 2000, "Reducing the Complexity of Geometric Selective Disassembly," Proceedings of the 2000 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2000) (Cat. No. 00CH37113), Takamatsu, Japan, Vol. 2, IEEE, pp. 1474–1479.
110. Tseng, H. E., Chang, C. C., Lee, S. C., and Huang, Y. M., 2018, "A Block-Based Genetic Algorithm for Disassembly Sequence Planning," Expert. Syst. Appl., 96, pp. 492–505.
111. Tseng, Y. J., Kao, H. T., and Huang, F. Y., 2010, "Integrated Assembly and Disassembly Sequence Planning Using a GA Approach," Int. J. Prod. Res., 48(20), pp. 5991–6013.
112. Kim, H. W., and Lee, D. H., 2018, "A Sample Average Approximation Algorithm for Selective Disassembly Sequencing With Abnormal Disassembly Operations and Random Operation Times," 96(1–4), pp. 1341–1354.
113. S., Perrard, C., and Henrioud, J. M., 2003, "On Disassembly Workshop Model Integration for Disassembly Planning," Proceedings of the IEEE International Symposium on Assembly and Task Planning, Besancon, France, IEEE, pp. 157–162.
114. Enomoto, A., Aoyama, Y., Yamauchi, Y., and Yamamoto, N., 2016, "Near Optimal Assembly Sequence Generation," 2016 IEEE/SICE International Symposium on System Integration (SII), Sapporo, Japan, IEEE, pp. 95–101.
115. Reveliotis, S. A., 2007, "Uncertainty Management in Optimal Disassembly Planning Through Learning-Based Strategies," IIE Trans., 39(6), pp. 645–658.
116. Luo, Y., Peng, Q., and Gu, P., 2016, "Integrated Multi-Layer Representation and Ant Colony Search for Product Selective Disassembly Planning," Comput. Indus., 75, pp. 13–26.
117. I., Trigui, M., and Benamara, A., 2016, "Subassembly Generation Algorithm From a CAD Model," 87(9–12), pp. 2829–2840.
118. Trigui, M., I., and Benamara, A., 2017, "Disassembly Plan Approach Based on Subassembly Concept," 90(1–4), pp. 219–231.
119. Agrawal, D., Kumara, S., and Finke, D., 2014, "Automated Assembly Sequence Planning and Subassembly Detection," Proceedings of the IIE Annual Conference, Institute of Industrial and Systems Engineers (IISE), pp. 781.
120. Huang, Y. M., and Huang, C. T., 2002, "Disassembly Matrix for Disassembly Processes of Products," Int. J. Prod. Res., 40(2), pp. 255–273.
121. Liu, J., Zhou, Z., Pham, D. T., Xu, W., Ji, C., and Liu, Q., 2018, "Robotic Disassembly Sequence Planning Using Enhanced Discrete Bees Algorithm in Remanufacturing," Int. J. Prod. Res., 56(9), pp. 3134–3151.
122. Mircheski, I., Pop-Iliev, R., and Kandikjan, T., 2016, "A Method for Improving the Process and Cost of Nondestructive Disassembly," ASME J. Mech. Des., 138(12), p. 121701.
123. Mircheski, I., and Rizov, T., 2017, "Improved Nondestructive Disassembly Process Using Augmented Reality and RFID Product/Part Tracking," TEM J., 6(4), pp. 671–677.
124. Gungor, A., and Gupta, S. M., 1998, "Disassembly Sequence Planning for Complete Disassembly in Product Recovery," Proceedings of the 1998 Northeast Decision Sciences Institute Conference, pp. 250–252.
125. Briceno, J. J., and Pochiraju, K., 2007, "Automatic Disassembly Plan Generation From CAD Assembly Models," 2007 IEEE International Symposium on Assembly and Manufacturing, pp. 64–69.
126. S., Kwak, M., Kim, H., and Thurston, D., 2010, "Simultaneous Selective Disassembly and End-of-Life Decision Making for Multiple Products That Share Disassembly Operations," ASME J. Mech. Des., 132(4), p. 041002.
127. Kang, C. M., Kwak, M. J., Cho, N. W., and Hong, Y. S., 2010, "Automatic Derivation of Transition Matrix for End-of-Life Decision Making," Int. J. Prod. Res., 48(11), pp. 3269–3298.
128. Lambert, A. J. D., 2001, "Automatic Determination of Transition Matrices in Optimal Disassembly Sequence Generation," Proceedings of the 2001 IEEE International Symposium on Assembly and Task Planning (ISATP2001), Assembly and Disassembly in the Twenty-first Century (Cat. No. 01TH8560), pp. 220–225.
129. Yu, B., Wu, E., Chen, C., Yang, Y., Yao, B., and Lin, Q., 2017, "A General Approach to Optimize Disassembly Sequence Planning Based on Disassembly Network: A Case Study From Automotive Industry," 12, pp. 305–320.
130. Parsa, S., and M., 2019, "Intelligent Selective Disassembly Planning Based on Disassemblability Characteristics of Product Components," 104, pp. 1769–1783.
131. Wang, H., Peng, Q., Zhang, J., and Gu, P., 2017, "Selective Disassembly Planning for the End-of-Life Product," Proc. CIRP, 60, pp. 512–517. Complex Systems Engineering and Development: Proceedings of the 27th CIRP Design Conference, Cranfield University, UK, May 10–12, 2017.
132. Costa, C. M., Veiga, G., Sousa, A., Rocha, L., Oliveira, E., Lopes Cardoso, H., and Thomas, U., 2018, "Automatic Generation of Disassembly Sequences and Exploded Views From Solidworks Symbolic Geometric Relationships," 2018 IEEE International Conference on Autonomous Robot Systems and Competitions (ICARSC), pp. 211–218.
133. Zhang, H. C., and Kuo, T. C., 1997, "A Graph-Based Disassembly Sequence Planning for EOL Product Recycling," Twenty-First IEEE/CPMT International Electronics Manufacturing Technology Symposium Proceedings, 1997 IEMT Symposium, pp. 140–151.
134. Zhang, X. F., and Zhang, S. Y., 2010, "Product Cooperative Disassembly Sequence Planning Based on Branch-and-Bound Algorithm," 51(9–12), pp. 1139–1147.
135. Srinivasan, H., and R., 1998, "A Geometric Algorithm for Single Selective Disassembly Using the Wave Propagation Abstraction," Comput.-Aided Design, 30(8), pp. 603–613.
136. Kongar, E., Gupta, S., and Al-Turki, Y., 2002, "A Fuzzy Goal Programming Approach to Disassembly Planning," The 6th Saudi Engineering Conference, KFUPM, Dhahran.
137. Kongar, E., and Gupta, S. M., 2006, "Disassembly to Order System Under Uncertainty," Omega, 34(6), pp. 550–561.
138. Ruijun, L., Guangdong, T., Xueyi, Z., Anyan, Z., Xiaolan, W., and Qingning, N., 2011, "Disassembly Sequence Optimization for Automotive Product Based on Probabilistic Planning Method," 2011 International Conference on Consumer Electronics, Communications and Networks (CECNet), pp. 284–288.
139. Hui, W., Dong, X., and Guanghong, D., 2008, "A Genetic Algorithm for Product Disassembly Sequence Planning," Neurocomputing, 71(13), pp. 2720–2726. Artificial Neural Networks (ICANN 2006)/Engineering of Intelligent Systems (ICEIS 2006).
140. Wang, J., Liu, J., Li, S., and Zhong, Y., 2003, "Intelligent Selective Disassembly Using the Ant Colony Algorithm," Artificial Intell. Eng. Design, Anal. Manufact., 17(4), pp. 325–333.
141. Mitrouchev, P., Wang, C. G., Lu, L. X., and Li, G. Q., 2015, "Selective Disassembly Sequence Generation Based on Lowest Level Disassembly Graph Method," 80(1), pp. 141–159.
142. Shyamsundar, N., and R., 1996, "Selective Disassembly of Virtual Prototypes," 1996 IEEE International Conference on Systems, Man and Cybernetics: Information Intelligence and Systems (Cat. No. 96CH35929), Vol. 4, pp. 3159–3164.
143. Smith, S., and Chen, W. H., 2009, "Rule-Based Recursive Selective Disassembly Sequence Planning for Green Design," Global Perspective for Competitive Enterprise, Economy and Ecology, Chou, S. Y., Trappey, A. J. C., Pokojski, J., and Smith, S., eds., Springer, London, pp. 291–302.
144. Smith, S., and Chen, W. H., 2012, "Multiple-Target Selective Disassembly Sequence Planning With Disassembly Sequence Structure Graphs," International Design Engineering Technical Conferences and Computers and Information in Engineering Conference, Vol. 3: 38th Design Automation Conference, Parts A and B, pp. 1305–1314.
145. Smith, S., Smith, G., and Chen, W. H., 2012, "Disassembly Sequence Structure Graphs: An Optimal Approach for Multiple-Target Selective Disassembly Sequence Planning," 26(2), pp. 306–316.
146. Srinivasan, H., Figueroa, R., and R., 1999, "Selective Disassembly for Virtual Prototyping As Applied to De-Manufacturing," Rob. Comput.-Int. Manufact., 15(3), pp. 231–245.
147. Srinivasan, H., and R., 1998, "Complexity Reduction in Geometric Selective Disassembly Using the Wave Propagation Abstraction," Proceedings of the 1998 IEEE International Conference on Robotics and Automation (Cat. No. 98CH36146), Leuven, Belgium, Vol. 2, IEEE, pp. 1478–1483.
148. Srinivasan, H., and R., 1999, "Selective Disassembly: Representation and Comparative Analysis of Wave Propagation Abstractions in Sequence Planning," Proceedings of the 1999 IEEE International Symposium on Assembly and Task Planning (ISATP’99) (Cat. No. 99TH8470), pp. 129–134.
149. Srinivasan, H., and R., 2000, "Efficient Geometric Disassembly of Multiple Components From An Assembly Using Wave Propagation," ASME J. Mech. Des., 122(2), pp. 179–184.
150. Mascle, C., and Balasoiu, B. A., 2003, "Algorithmic Selection of a Disassembly Sequence of a Component by a Wave Propagation Method," Rob. Comput.-Int. Manufact., 19(5), pp. 439–448.
151. Chung, C., and Peng, Q., 2005, "An Integrated Approach to Selective-Disassembly Sequence Planning," Rob. Comput. Int. Manufact., 21(4–5), pp. 475–485.
152. Chung, C., and Peng, Q., 2006, "Evolutionary Sequence Planning for Selective Disassembly in De-Manufacturing," Int. J. Comput. Int. Manufact., 19(3), pp. 278–286.
153. ElSayed, A., Kongar, E., and Gupta, S., 2011, "An Evolutionary Algorithm for Selective Disassembly of End-of-Life Products," Int. J. Swarm Intell. Evol. Comput., 1, p. 7.
154. ElSayed, A., Kongar, E., Gupta, S. M., and Sobh
,
T.
,
2012
, “
A Robotic-Driven Disassembly Sequence Generator for End-Of-Life Electronic Products
,”
J. Intell. Rob. Syst.
,
68
(
1
), pp.
43
52
.
155.
Guo
,
X.
,
Zhou
,
M.
,
Liu
,
S.
, and
Qi
,
L.
,
2021
, “
Multiresource-Constrained Selective Disassembly With Maximal Profit and Minimal Energy Consumption
,”
IEEE Trans. Auto. Sci. Eng.
,
18
(
2
), pp.
804
816
.
156.
Rickli
,
J. L.
, and
Camelio
,
J. A.
,
2013
, “
Multi-Objective Partial Disassembly Optimization Based on Sequence Feasibility
,”
J. Manuf. Syst.
,
32
(
1
), pp.
281
293
.
157.
Hu
,
B.
,
Feng
,
Y.
,
Zheng
,
H.
, and
Tan
,
J.
,
2018
, “
Sequence Planning for Selective Disassembly Aiming At Reducing Energy Consumption Using a Constraints Relation Graph and Improved Ant Colony Optimization Algorithm
,”
Energies
,
11
(
8
), p.
2106
.
158.
Li
,
J. R.
,
Khoo
,
L. P.
, and
Tor
,
S. B.
,
2005
, “
An Object-Oriented Intelligent Disassembly Sequence Planner for Maintenance
,”
Comput. Indus.
,
56
(
7
), pp.
699
718
.
159.
Tian
,
G.
,
Ren
,
Y.
,
Feng
,
Y.
,
Zhou
,
M.
,
Zhang
,
H.
, and
Tan
,
J.
,
2019
, “
Modeling and Planning for Dual-Objective Selective Disassembly Using and/or Graph and Discrete Artificial Bee Colony
,”
IEEE Trans. Indus. Inform.
,
15
(
4
), pp.
2456
2468
.
160.
Jin
,
G.
,
Li
,
W.
, and
Xia
,
K.
,
2013
, “
Disassembly Matrix for Liquid Crystal Displays Televisions
,”
Proc. CIRP
,
11
, pp.
357
362
.
2nd International Through-life Engineering Services Conference
.
161. Li, W., Xia, K., Gao, L., and Chao, K. M., 2013, “Selective Disassembly Planning for Waste Electrical and Electronic Equipment With Case Studies on Liquid Crystal Displays,” Rob. Comput.-Int. Manufact., 29(4), pp. 248–260.
162. Zhang, X., 2010, “Object Selective Disassembly Sequence Planning for Complex Mechanical Products,” J. Mech. Eng., 46, p. 172.
163. Alshibli, M., El Sayed, A., Kongar, E., Sobh, T. M., and Gupta, S. M., 2016, “Disassembly Sequencing Using Tabu Search,” J. Intell. Rob. Syst., 82(1), pp. 69–79.
164. Guo, X., Liu, S., Zhou, M., and Tian, G., 2016, “Disassembly Sequence Optimization for Large-Scale Products With Multiresource Constraints Using Scatter Search and Petri Nets,” IEEE Trans. Cybern., 46(11), pp. 2435–2446.
165. Guo, X., Liu, S., Zhou, M., and Tian, G., 2018, “Dual-Objective Program and Scatter Search for the Optimization of Disassembly Sequences Subject to Multiresource Constraints,” IEEE Trans. Auto. Sci. Eng., 15(3), pp. 1091–1103.
166. Guo, X., Zhou, M., Liu, S., and Qi, L., 2020, “Lexicographic Multiobjective Scatter Search for the Optimization of Sequence-Dependent Selective Disassembly Subject to Multiresource Constraints,” IEEE Trans. Cybern., 50(7), pp. 3307–3317.
167. Xiwang, G., Shixin, L., Dazhi, W., and Chunming, H., 2012, “An Improved Multi-objective Scatter Search Approach for Solving Selective Disassembly Optimization Problem,” Proceedings of the 31st Chinese Control Conference, pp. 7703–7708.
168. Aguinaga, I., Borro, D., and Matey, L., 2008, “Parallel RRT-Based Path Planning for Selective Disassembly Planning,” 36(11), pp. 1221–1233.
169. Han, H. J., Yu, J. M., and Lee, D. H., 2013, “Mathematical Model and Solution Algorithms for Selective Disassembly Sequencing With Multiple Target Components and Sequence-dependent Setups,” Int. J. Prod. Res., 51(16), pp. 4997–5010.
170. Ghandi, S., and Masehian, E., 2015, “Review and Taxonomies of Assembly and Disassembly Path Planning Problems and Approaches,” Comput. Aided Des., 67(C), pp. 58–86.
171. Guo, X., Zhou, M., Abusorrah, A., Alsokhiry, F., and Sedraoui, K., 2020, “Disassembly Sequence Planning: A Survey,” IEEE/CAA J. Auto. Sinica, pp. 1–17.
172. Lambert, A., 1997, “Optimal Disassembly of Complex Products,” Int. J. Prod. Res., 35(9), pp. 2509–2523.
173. Zhong, L., Youchao, S., Gabriel, O. E., and Haiqiao, W., 2011, “Disassembly Sequence Planning for Maintenance Based on Metaheuristic Method,” Aircraft Eng. Aeros. Technol., 83(3), pp. 138–145.
174. Chang, M., Ong, S., and Nee, A., 2017, “AR-Guided Product Disassembly for Maintenance and Remanufacturing,” Proc. CIRP, 61(1), pp. 299–304.
175. Álvarez, H., Aguinaga, I., and Borro, D., 2011, “Providing Guidance for Maintenance Operations Using Automatic Markerless Augmented Reality System,” 2011 10th IEEE International Symposium on Mixed and Augmented Reality, pp. 181–190.
176. Makris, S., Pintzos, G., Rentzos, L., and Chryssolouris, G., 2013, “Assembly Support Using AR Technology Based on Automatic Sequence Generation,” CIRP Ann., 62(1), pp. 9–12.
177. Gungor, A., and Gupta, S. M., 1998, “Disassembly Sequence Planning for Products With Defective Parts in Product Recovery,” Comput. Indus. Eng., 35(1-2), pp. 161–164.
178. Riegel, J., Mayer, W., and van Havre, Y., 2002–2020.
179. MATLAB: version 9.7.0.1296695 (R2019b). The MathWorks Inc., Natick, Massachusetts (2020).
180. Android Studio: version 3.6.1. https://developer.android.com/studio (2013–2020).
181. Musen, M. A., 2015, “The Protégé Project: a Look Back and a Look Forward,” AI Matters, 1(4), pp. 4–12.
182. Ameen, A., Khan, K. U. R., and Rani, B. P., 2014, “Reasoning in Semantic Web Using Jena,” Comput. Eng. Intell. Syst., 5(4), pp. 39–47.
183. Vuforia: version 9-3-3. https://developer.vuforia.com (2015–2020).
184.
185. OpenGL ES: version 3.1. https://www.khronos.org/opengles/ (2015–2020).
186. Sahu, C. K., Young, C., and Rai, R., 2020, “Artificial Intelligence (AI) in Augmented Reality (AR)-Assisted Manufacturing Applications: a Review,” Int. J. Prod. Res., 0(0), pp. 1–57.
187. Kato, H., Billinghurst, M., and Poupyrev, I., 2020, ARToolKit. http://www.hitl.washington.edu/artoolkit/.
188. Wikitude GmbH, Wikitude Augmented Reality: The World’s Leading Cross-Platform AR SDK. https://www.wikitude.com/ (2021).
189. Kudan Inc., Home — Kudan. https://www.kudan.io/ (2021).
190. (2020).
192. VisionStar Information Technology (Shanghai) Co. Ltd., EasyAR—Augmented Reality & AR SDK. https://www.easyar.com (2020).
193. bitstars, DroidAR by bitstars. https://bitstars.github.io/droidar/ (2020).
194. Unity, Unity Real-Time Development Platform — 3D, 2D VR & AR Engine. https://unity.com (2005–2021).
195. Apple Inc., Xcode. https://developer.apple.com/xcode (2003–2021).
196. Microsoft, Visual Studio IDE. https://visualstudio.microsoft.com (1997–2021).