Please use this identifier to cite or link to this item: https://hdl.handle.net/10495/42049
Full metadata record

DC Field  Value  Language
dc.contributor.author  Pabón Vidal, Adriana Lucía  -
dc.contributor.author  García Sucerquia, Jorge Iván  -
dc.contributor.author  Gómez Ramírez, Alejandra  -
dc.contributor.author  Herrera Ramírez, Jorge Alexis  -
dc.contributor.author  Buitrago Duque, Carlos Andrés  -
dc.contributor.author  Lopera Acosta, María Josef  -
dc.contributor.author  Montoya, Manuel  -
dc.contributor.author  Trujillo Anaya, Carlos Alejandro  -
dc.date.accessioned  2024-09-12T00:02:33Z  -
dc.date.available  2024-09-12T00:02:33Z  -
dc.date.issued  2023  -
dc.identifier.issn  0143-8166  -
dc.identifier.uri  https://hdl.handle.net/10495/42049  -
dc.description.abstract  ABSTRACT: This paper reports on a convolutional neural network (CNN)-based regression model, called FocusNET, that predicts the accurate reconstruction distance of raw holograms in Digital Lensless Holographic Microscopy (DLHM). The proposal provides a physical-mathematical formulation that extends its use to DLHM setups with optical and geometrical conditions different from those used to record the training dataset; this unique feature is tested by applying the proposal to holograms of diverse samples recorded with different DLHM setups. Additionally, a comparison between FocusNET and conventional autofocusing methods in terms of processing time and accuracy is provided. Although the proposed method predicts reconstruction distances with a standard deviation of approximately 54 µm, accurate information about the samples in the validation dataset is still retrieved. Compared to a method that uses a stack of reconstructions to find the best focal plane, FocusNET runs 600 times faster, as no hologram reconstruction is needed. When implemented in batches, the network achieves up to a 1200-fold reduction in processing time, depending on the number of holograms to be processed. The training and validation datasets, and the code implementations, are hosted in a freely accessible public GitHub repository.  spa
dc.format.extent  10 pages  spa
dc.format.mimetype  application/pdf  spa
dc.language.iso  eng  spa
dc.publisher  Elsevier  spa
dc.type.hasversion  info:eu-repo/semantics/publishedVersion  spa
dc.rights  info:eu-repo/semantics/openAccess  spa
dc.rights.uri  http://creativecommons.org/licenses/by-nc-nd/2.5/co/  *
dc.title  FocusNET: An autofocusing learning-based model for digital lensless holographic microscopy  spa
dc.type  info:eu-repo/semantics/article  spa
dc.publisher.group  Grupo Malaria  spa
dc.identifier.doi  10.1016/j.optlaseng.2023.107546  -
oaire.version  http://purl.org/coar/version/c_970fb48d4fbd8a85  spa
dc.rights.accessrights  http://purl.org/coar/access_right/c_abf2  spa
dc.identifier.eissn  1873-0302  -
oaire.citationtitle  Optics and Lasers in Engineering  spa
oaire.citationstartpage  1  spa
oaire.citationendpage  10  spa
oaire.citationvolume  165  spa
dc.rights.creativecommons  https://creativecommons.org/licenses/by-nc-nd/4.0/  spa
dc.publisher.place  London, England  spa
dc.type.coar  http://purl.org/coar/resource_type/c_2df8fbb1  spa
dc.type.redcol  https://purl.org/redcol/resource_type/ART  spa
dc.type.local  Research article  spa
dc.subject.decs  Aprendizaje Profundo  -
dc.subject.decs  Deep Learning  -
dc.subject.decs  Microscopía  -
dc.subject.decs  Microscopy  -
dc.description.researchgroupid  COL0007524  spa
dc.subject.meshuri  https://id.nlm.nih.gov/mesh/D000077321  -
dc.subject.meshuri  https://id.nlm.nih.gov/mesh/D008853  -
dc.relation.ispartofjournalabbrev  Opt. Lasers Eng.  spa
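The abstract describes FocusNET as a CNN-based regression model that maps a raw hologram directly to a single scalar, the reconstruction distance, with no trial reconstructions. As a rough illustration of that idea only (this is not the paper's architecture; the kernel, layer sizes, and weights below are random placeholders), the conv → ReLU → pooling → linear-head pipeline producing one scalar can be sketched in plain NumPy:

```python
# Illustrative sketch of a CNN-style regression that maps a hologram to one
# scalar (a reconstruction distance). NOT the published FocusNET model: all
# shapes and weights here are arbitrary stand-ins.
import numpy as np

def conv2d_valid(img, kernel):
    """Naive 'valid' 2-D cross-correlation of a single-channel image."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def avg_pool(img, size=2):
    """Non-overlapping average pooling; trims edges not divisible by size."""
    h, w = img.shape
    h, w = h - h % size, w - w % size
    return img[:h, :w].reshape(h // size, size, w // size, size).mean(axis=(1, 3))

def predict_distance(hologram, kernel, weights, bias):
    """Conv -> ReLU -> average pool -> flatten -> linear head -> scalar."""
    feat = np.maximum(conv2d_valid(hologram, kernel), 0.0)  # conv + ReLU
    feat = avg_pool(feat).ravel()                           # downsample, flatten
    return float(feat @ weights + bias)                     # scalar distance

rng = np.random.default_rng(0)
holo = rng.random((16, 16))            # stand-in for a raw DLHM hologram
kernel = rng.standard_normal((3, 3))   # one untrained convolutional filter
feat_len = ((16 - 3 + 1) // 2) ** 2    # 14x14 conv output pooled to 7x7 = 49
w = rng.standard_normal(feat_len)
z_pred = predict_distance(holo, kernel, w, 0.0)
```

In a trained model, the convolutional and linear weights would be fitted by minimizing a regression loss against ground-truth focus distances; the single forward pass of such a network is what lets the paper's method avoid reconstructing a stack of trial focal planes, which the abstract credits for the reported speedup.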
Appears in Collections: Artículos de Revista en Ciencias Médicas

Files in This Item:
File  Description  Size  Format
PabonAdriana_2023_FocusNET_Lensless_Microscopy.pdf  Research article  2.87 MB  Adobe PDF


This item is licensed under a Creative Commons License.