Optics.org
daily coverage of the optics & photonics industry and the markets that it serves
RAPTOR uses an attention mechanism to prioritize nanoparticle correlations across pre-tamper and post-tamper samples before passing them into a residual, attention-based deep convolutional classifier. a) RAPTOR takes the top 56 nanoparticles in descending order of radius to construct the distance matrices D and D′ and the radii vectors ρ and ρ′ from the pre-tamper and post-tamper samples. b) The radii and distance matrices form the query and value embeddings of an attention mechanism, whose output is passed to the classifier alongside the raw distance matrices D and D′ and the soft-weight and L2 matrices generated from the radii vectors. c) The classifier applies GELU activations and attention layers, followed by a kernel layer and a max-pooling layer; the output is then flattened and fed into a multi-layer perceptron that computes the final classification. Credit: Wilson et al., DOI: 10.1117/1.AP.6.5.056002
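To make the data flow in panel b) concrete, here is a minimal NumPy sketch of the feature construction the caption describes, assuming standard scaled dot-product attention. The query/key/value assignment, the soft-weight formula, and the pairwise L2 form below are illustrative assumptions, not the definitions published by Wilson et al.

```python
import numpy as np

N_PARTICLES = 56  # the caption keeps the top 56 nanoparticles by radius


def distance_matrix(centres):
    """Pairwise Euclidean distance matrix (N x N) between particle centres."""
    diff = centres[:, None, :] - centres[None, :, :]
    return np.linalg.norm(diff, axis=-1)


def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)


def raptor_features(rho, rho_p, D, D_p):
    """Assemble per-pair feature channels for the classifier.

    rho, rho_p : (N,) radii from the pre-/post-tamper samples
    D, D_p     : (N, N) distance matrices from the same samples
    """
    # Attention: radii supply the query/key side, post-tamper distances
    # supply the values (an assumed assignment, per the lead-in above).
    q = rho[:, None]                   # (N, 1) query embedding
    k = rho_p[:, None]                 # (N, 1) key embedding
    scores = (q @ k.T) / np.sqrt(q.shape[-1])  # scaled dot-product, (N, N)
    attn = softmax(scores, axis=-1) @ D_p      # attention-weighted distances

    # Pairwise L2 matrix between the two radii vectors (assumed form).
    l2 = (rho[:, None] - rho_p[None, :]) ** 2

    # Soft-weight matrix: a smooth similarity of the two distance
    # matrices (assumed form).
    soft_w = softmax(-np.abs(D - D_p), axis=-1)

    # Stack channels for the downstream convolutional classifier.
    return np.stack([attn, D, D_p, soft_w, l2], axis=0)  # (5, N, N)
```

The stacked (5, 56, 56) tensor would then feed the convolutional stage sketched in panel c), with GELU activations, attention layers, pooling, and a final multi-layer perceptron producing the tamper/no-tamper decision.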