AUTHORS
A. Meunier and E. Burtin. ESRF
image belongs to a certain class or not, as illustrated in Figure 139. The neuron is a mathematical function that applies weights, wi, to the n inputs (pixel values, xi), sums them up (Σ = w1x1 + ... + wnxn) and passes this sum through a nonlinear function, referred to as the activation function (A), to produce the output (Y). The image belongs to the class if Σ > 0 (Y = A(Σ) = 1), or not if Σ < 0 (Y = A(Σ) = 0). The learning phase consists of fixing the output to 1 while processing a reference image identified by the user and associated with the class to be learned. The learning algorithm optimises the weights by minimising the error, or loss, between the predicted and the true outputs. In other words, the aim of learning is to fix the boundary in input space between the two possible neuron outputs, 'active' and 'inactive'. Classification then consists of determining which of the two half-spaces the input image belongs to. The more the perceptron learns, the more likely the classification will be correct.
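The single-neuron scheme described above can be summarised in a few lines of code. The sketch below is only an illustration of the principle, not the FASTVAC implementation: it assumes the input is a flat vector of pixel values and uses the classic perceptron update rule, in which the weights are nudged whenever the predicted output disagrees with the user-supplied label.

```python
import numpy as np

def neuron(x, w):
    """Single neuron: weighted sum of the inputs followed by a step activation."""
    s = np.dot(w, x)              # S = sum_i w_i * x_i
    return 1 if s > 0 else 0      # Y = A(S): active (1) if S > 0, inactive (0) otherwise

def train(samples, labels, epochs=20, lr=0.1):
    """Perceptron learning rule: adjust the weights whenever the predicted
    output differs from the true, user-supplied label."""
    w = np.zeros(samples.shape[1])
    for _ in range(epochs):
        for x, y_true in zip(samples, labels):
            error = y_true - neuron(x, w)   # 0 if correct, +/-1 otherwise
            w += lr * error * x             # move the decision boundary towards the example
    return w
```

Repeatedly presenting labelled reference inputs in this way shifts the boundary between the 'active' and 'inactive' half-spaces until the classification becomes reliable.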
The FASTVAC perceptron is more complex. Indeed, more intelligence, meaning more neurons, was needed to fully dissociate the different pressure event classes. The improved perceptron consists of 10 neurons giving 10 outputs, and each possible combination of outputs (the list of activated neurons) corresponds to one of the event classes. After one year of perceptron learning under human supervision, the results are very promising. The classification rates for the different classes are good, albeit with some room for improvement: the success rates given in Figure 138c are above 90% for events such as gauge malfunction, beam kill and gauge ignition, and 80% and 70% respectively for valve moving and pressure wave propagation.
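As a rough sketch of how a bank of such neurons can separate several event classes, the snippet below evaluates 10 neurons on the same input features and looks up the resulting activation pattern. It is again a simplified illustration rather than the FASTVAC code: only the class names are taken from the text, while the feature vector, weights and the one-hot mapping of patterns to classes are assumptions.

```python
import numpy as np

N_NEURONS = 10

def layer(x, W, b):
    """Evaluate all 10 neurons on the same input vector; each row of W holds
    the weights of one neuron. Returns the pattern of activated neurons."""
    s = W @ x + b                          # one weighted sum per neuron
    return tuple((s > 0).astype(int))      # step activation, neuron by neuron

# Illustrative (assumed) mapping from activation patterns to event classes;
# the class names come from the article, the patterns are arbitrary.
CLASS_MAP = {
    (1, 0, 0, 0, 0, 0, 0, 0, 0, 0): "gauge malfunction",
    (0, 1, 0, 0, 0, 0, 0, 0, 0, 0): "beam kill",
    (0, 0, 1, 0, 0, 0, 0, 0, 0, 0): "gauge ignition",
    (0, 0, 0, 1, 0, 0, 0, 0, 0, 0): "valve moving",
    (0, 0, 0, 0, 1, 0, 0, 0, 0, 0): "pressure wave propagation",
}

def classify(x, W, b):
    return CLASS_MAP.get(layer(x, W, b), "unclassified")
```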
Overall, the integration of the perceptron within the FASTVAC system saves time in the analysis of pressure events, which was previously carried out manually. FASTVAC not only detects and records vacuum pressure events but now also classifies them, with very good results. Further developments are ongoing, adapting the inputs and adding layers to the neural network. In addition, perceptrons could also prove useful for other vacuum applications, such as the interpretation of residual gas spectra.
Fig. 139: Illustration of the single-neuron process for supervised learning and classification.