Artificial Intelligence Research
https://www.sciedu.ca/journal/index.php/air
<p><img style="float: right; padding-left: 20px; padding-right: 20px;" src="/journal/public/site/images/air/air.jpg" alt="" width="300" /><em><span style="font-family: Calibri, sans-serif;">Artificial Intelligence Research </span></em>(<strong>AIR</strong>)<span style="font-family: Calibri, sans-serif;"> is a peer-reviewed, international scientific journal that provides a forum for original research, reviews, experience exchanges, and conference reports in the fields of Artificial Intelligence and its applications, aimed at researchers, programmers, and software and hardware manufacturers.</span></p><p style="font-family: Calibri, sans-serif;">Topics of interest include, but are not limited to: machine learning, pattern recognition, computer vision, expert systems, robotics, artificial neural networks, genetic algorithms, natural language processing, automated reasoning, intelligent search engines, complex systems, data mining, and intelligent control.</p><p style="font-family: Calibri, sans-serif;">The journal strives to achieve high quality through double-blind peer review, as specified in the <a href="/journal/index.php/air/about/submissions#authorGuidelines">Author Guidelines</a>.</p><p style="font-family: Calibri, sans-serif;"><strong>AIR</strong> is included in:</p><ul style="font-family: Calibri, sans-serif;"><li><a style="font-size: 10px;" href="https://scholar.google.com/citations?hl=en&pli=1&user=LeVd49sAAAAJ">Google Scholar Citations</a></li><li>ProQuest</li><li>LOCKSS</li><li>PKP Open Archives Harvester</li><li>SHERPA/RoMEO</li><li>The Standard Periodical Directory</li><li>Academic Resource Index</li><li>CiteFactor</li></ul><p>The journal is published in both print and online versions.
The online version is free to access and download.</p><p>To facilitate rapid publication and minimize administrative costs, the Journal accepts <a href="/journal/index.php/air/about/submissions"><strong>Online submission</strong></a> and <strong><a href="mailto:air@sciedupress.com">Email submission</a></strong>. For online submission, please register and then follow the instructions provided.</p><p><strong>AIR’s Sections:</strong></p><p>Original Research, Conference Report, Experience Exchange, Reviews</p>
Sciedu Press · en-US · Artificial Intelligence Research · ISSN 1927-6974
<p>Submission of an article implies that the work described has not been published previously (except in the form of an abstract or as part of a published lecture or academic thesis), that it is not under consideration for publication elsewhere, that its publication is approved by all authors and tacitly or explicitly by the responsible authorities where the work was carried out, and that, if accepted, it will not be published elsewhere in the same form, in English or in any other language, without the written consent of the Publisher. The Editors reserve the right to edit or otherwise alter all contributions, but authors will receive proofs for approval before publication.</p><p>Copyrights for articles published in our journals are retained by the authors, with first publication rights granted to the journal. The journal/publisher is not responsible for subsequent uses of the work. It is the author's responsibility to bring an infringement action if so desired by the author.</p>
Classification of Echocardiogram View using A Convolutional Neural Network
https://www.sciedu.ca/journal/index.php/air/article/view/20733
<p>The standard views in echocardiography capture distinct slices of the heart that can be used to assess cardiac function. Determining the view of a given echocardiogram is the first step in analysis. To automate this step, a deep network of the ResNet-18 architecture was used to classify six standard views. The network parameters were pre-trained on the ImageNet database, and prediction quality was assessed with a visualization tool known as gradient-weighted class activation mapping (Grad-CAM). The network was able to distinguish between three parasternal short axis views and three apical views with ~99% accuracy. Ten-fold cross-validation showed 97%-98% accuracy for the apical view subcategories (which included apical two-, three-, and four-chamber views). Grad-CAM images of these views highlighted features similar to those used by experts in manual classification. Parasternal short axis subcategories (which included apex level, mitral valve level, and papillary muscle level) had accuracies of 54%-73%. Grad-CAM images illustrate that the network classifies most parasternal short axis views as belonging to the papillary muscle level. Incorporating more images and time-dependent features would likely increase the parasternal short axis view accuracy. Overall, a convolutional neural network can be used to reliably classify echocardiogram views.</p>
Hannah Ornstein, Dan Adam
Copyright (c) 2021 Artificial Intelligence Research
Published 2021-09-14 · Vol. 11, No. 1, p. 1 · DOI: 10.5430/air.v11n1p1
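The Grad-CAM tool used in the abstract above has a simple core computation: each channel of the last convolutional layer's feature maps is weighted by the global-average-pooled gradient of the class score with respect to that channel, and the weighted sum is passed through a ReLU. A minimal sketch of that computation in plain Python (the function name and the list-of-lists tensor representation are illustrative, not from the paper, which used a full deep-learning framework):

```python
def grad_cam(feature_maps, gradients):
    """Compute a Grad-CAM heatmap.

    feature_maps: K channels from the last conv layer, each an H x W
                  list of lists (activations A_k).
    gradients:    matching K channels of dY/dA for the predicted class.
    Returns an H x W heatmap: ReLU(sum_k alpha_k * A_k).
    """
    k = len(feature_maps)
    h = len(feature_maps[0])
    w = len(feature_maps[0][0])

    # alpha_k: global-average-pool the gradient over each channel's spatial extent
    alphas = [sum(sum(row) for row in g) / (h * w) for g in gradients]

    # Weighted sum over channels, then ReLU, so only positively
    # contributing regions are highlighted.
    return [[max(0.0, sum(alphas[c] * feature_maps[c][i][j] for c in range(k)))
             for j in range(w)] for i in range(h)]
```

In practice the feature maps and gradients come from forward/backward hooks on the network's last convolutional block, and the heatmap is upsampled to the input image size before being overlaid on the echocardiogram.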