
sample size, which can help researchers work with small samples and avoid overfitting. Overfitting occurs when a model learns the idiosyncrasies of the specific training sample rather than generalizable features; as a result, its performance on images that do not belong to that sample is much lower than its performance on the images used for training.
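
      The gap described above is usually exposed with a held-out validation set. The following is a minimal sketch, not taken from the cited studies: a flexible model is fit to a deliberately small training sample and then scored on data it has never seen; the dataset and model choices are illustrative assumptions only.

```python
# Minimal sketch of detecting overfitting via a train/validation split
# (illustrative assumptions: scikit-learn's digits dataset stands in for
# radiographic images, and a decision tree stands in for the model).
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

X, y = load_digits(return_X_y=True)              # small stand-in image dataset
X_train, X_val, y_train, y_val = train_test_split(
    X, y, train_size=100, random_state=0)        # deliberately tiny training sample

model = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)

train_acc = accuracy_score(y_train, model.predict(X_train))
val_acc = accuracy_score(y_val, model.predict(X_val))
print(f"training accuracy:   {train_acc:.2f}")   # typically ~1.00 (sample memorized)
print(f"validation accuracy: {val_acc:.2f}")     # substantially lower: overfitting
```

      A large gap between the two scores is the practical signature of overfitting; enlarging or augmenting the training sample narrows it.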

      In addition to object detection in radiographic analysis, NNs have been used for automated identification and diagnostic tasks. Examples include the comparison of antemortem and postmortem panoramic radiographs for human postmortem identification [25], the diagnosis of osteoporosis on panoramic radiographs [26–28], and malocclusion diagnosis [29, 30].
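
      These diagnostic applications are typically framed as image classification. The sketch below is a minimal, illustrative PyTorch example of such a classifier, not the architecture of any cited study; the grayscale 224 × 224 input and the two-class output are assumptions made for the example.

```python
# Minimal CNN classifier sketch for radiograph-level diagnosis
# (illustrative only; input size and class count are assumptions).
import torch
import torch.nn as nn

class RadiographClassifier(nn.Module):
    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, num_classes)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

# Dummy batch standing in for preprocessed panoramic radiographs.
images = torch.randn(4, 1, 224, 224)
logits = RadiographClassifier()(images)
print(logits.shape)  # torch.Size([4, 2]) -> per-image class scores
```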

      Another important possibility with CNNs is the automated identification and selection of structures in CBCT DICOM images. With this approach, automated mandibular canal detection [31], automated mandible segmentation [32], and an automated approach to dental implant planning [33] become possible.
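
      The preprocessing these pipelines rely on can be sketched briefly, assuming the pydicom library and a placeholder folder path: the DICOM slices are read, ordered, and stacked into a 3D volume that a trained segmentation network (not shown here) then labels voxel by voxel.

```python
# Minimal sketch of CBCT DICOM handling ahead of automated segmentation
# (illustrative, not the cited tools; the directory path is a placeholder).
from pathlib import Path
import numpy as np
import pydicom

def load_cbct_volume(dicom_dir: str) -> np.ndarray:
    """Stack a folder of single-slice DICOM files into a z-ordered 3D array."""
    slices = [pydicom.dcmread(p) for p in Path(dicom_dir).glob("*.dcm")]
    # Order slices along the scan axis using the patient position tag.
    slices.sort(key=lambda ds: float(ds.ImagePositionPatient[2]))
    volume = np.stack([ds.pixel_array.astype(np.float32) for ds in slices])
    # Convert stored values to calibrated units when rescale tags are present.
    slope = float(getattr(slices[0], "RescaleSlope", 1.0))
    intercept = float(getattr(slices[0], "RescaleIntercept", 0.0))
    return volume * slope + intercept

volume = load_cbct_volume("path/to/cbct_exam")   # placeholder path
print(volume.shape)  # (slices, rows, cols), ready for a trained 3D segmentation CNN
```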

[Figures: screen captures of the software Hello Pearl (Los Angeles, USA) showing automatic detection of (a) a maxillary area with bone loss, (b) a mandibular area with bone loss, (c) marginal discrepancy of a metallic restoration, and (d) calculus.]

      1 Kim, J.H., Abdala‐Júnior, R., Munhoz, L. et al. (2020). Comparison between different cone‐beam computed tomography devices in the detection of mechanically simulated peri‐implant bone defects. Imaging Sci. Dent. 50 (2): 133–139.

      2 Bagis, N., Kolsuz, M.E., Kursun, S., and Orhan, K. (2015). Comparison of intraoral radiography and cone‐beam computed tomography for the detection of periodontal defects: an in vitro study. BMC Oral Health 15: 64.

      3 Nasseh, I. and Al‐Rawi, W. (2018). Cone beam computed tomography. Dent. Clin. North Am. 62 (3): 361–391.

      4 Kolsuz, M.E., Bagis, N., Orhan, K. et al. (2015). Comparison of the influence of FOV sizes and different voxel resolutions for the assessment of periodontal defects. Dentomaxillofac. Radiol. 44 (7): 20150070.

      5 Grant, G.T., Campbell, S.D., Masri, R.M. et al. (2016). Glossary of digital dental terms: American College of Prosthodontists. J. Prosthodont. 25 (Suppl 2): S2–S9.

      6 Davidowitz, G. and Kotick, P.G. (2011). The use of CAD/CAM in dentistry. Dent. Clin. North Am. 55 (3): 559–570.

      7 Hung, K., Yeung, A.W.K., Tanaka, R., and Bornstein, M.M. (2020). Current applications, opportunities, and limitations of AI for 3D imaging in dental research and practice. Int. J. Environ. Res. Public Health 17 (12): 4424.

      8 Markarian, R.A., da Silva, R.L.B., Burgoa, S. et al. (2021). Clinical relevance of digital dentistry during COVID‐19 outbreak: a scoped review. Braz. J. Oral Sci. 19: e200201.

      9 No‐Cortes, J., Ayres, A.P., Lima, J.E. et al. (2021). Trueness, 3D deviation, time and cost comparisons between milled and 3D‐printed resin single crowns. Eur. J. Prosthodont. Restor. Dent. 29: 1–6.

      10 Coachman, C., Sesma, N., and Blatz, M.B. (2021). The complete digital workflow in interdisciplinary dentistry. Int. J. Esthet. Dent. 16: 1–18.

      11 Park, W.J. and Park, J.B. (2018). History and application of artificial neural networks in dentistry. Eur. J. Dent. 12: 594–601.

      12 Mupparapu, M., Wu, C.W., and Chen, Y.C. (2018). Artificial intelligence, machine learning, neural networks, and deep learning: futuristic concepts for new dental diagnosis. Quintessence Int. 49: 687–688.

      13 Naylor, C.D. (2018). On the prospects for a (deep) learning health care system. JAMA 320 (11): 1099–1100.

      14 Peng, J., Zeng, X., Townsend, J. et al. (2021). A machine learning approach to uncovering hidden utilization patterns of early childhood dental care among Medicaid‐insured children. Front. Public Health 8: 599187.

      15 Hornik, K. (1991). Approximation capabilities of multilayer feedforward networks. Neural Netw. 4 (2): 251–257.

      16 Krizhevsky, A., Sutskever, I., and Hinton, G.E. (2012). ImageNet classification with deep convolutional neural networks. Adv. Neural Inf. Process. Syst. 25: 1–9.

      17 Tuzoff, D.V., Tuzova, L.N., Bornstein, M.M. et al. (2019). Tooth detection and numbering in panoramic radiographs using convolutional neural networks. Dentomaxillofac. Radiol. 48 (4): 20180051.

      18 Lee, J.H., Kim, D.H., Jeong, S.N., and Choi, S.H. (2018). Diagnosis and prediction of periodontally compromised teeth using a deep learning‐based convolutional neural network algorithm. J. Periodontal Implant Sci. 48 (2): 114–123.

      19 Lee, J.H., Kim, D.H., Jeong, S.N., and Choi, S.H. (2018). Detection and diagnosis of dental caries using a deep learning‐based convolutional neural network algorithm. J. Dent. 77: 106–111.

      20 Ekert, T., Krois, J., Meinhold, L. et al. (2019). Deep learning for the radiographic detection of apical lesions. J. Endod. 45 (7): 917–922.

      21 Ariji, Y., Yanashita, Y., Kutsuna, S. et al. (2019). Automatic detection and classification of radiolucent lesions in the mandible on panoramic radiographs using a deep learning object detection technique. Oral Surg. Oral Med. Oral Pathol. Oral Radiol. 128 (4): 424–430.

      22 Poedjiastoeti, W. and Suebnukarn, S. (2018). Application of convolutional neural network in the diagnosis of jaw tumors. Healthc. Inform. Res. 24 (3): 236–241.

      23 Kositbowornchai, S., Plermkamon, S., and Tangkosol, T. (2013). Performance of an artificial neural network for vertical root fracture detection: an ex vivo study. Dent. Traumatol. 29 (2): 151–155.

      24
