Original Article

The Earphone Project pilot: a tele-otology study for remote Aboriginal communities

Alexander J. Saxby1,2 ORCID logo, Daniel Schofield1, Fiona Tout1, Joseph Gordon3, Misha M. Verkerk2, Timothy Watson1, Nicholas Jufas1,4,5, Jonathan H. K. Kong1,2,5, Nirmal Patel1,4,5, Katrina Ward3, Judy Caswell3,6,7

1Faculty of Medicine, University of Sydney, Sydney, Australia; 2Department of Otolaryngology, Royal Prince Alfred Hospital, Sydney, Australia; 3Brewarrina Aboriginal Medical Service, Brewarrina, Australia; 4Department of Otolaryngology, Royal North Shore Hospital, Sydney, Australia; 5Faculty of Medicine, Macquarie University, Sydney, Australia; 6Bourke Aboriginal Medical Service, Bourke, Australia; 7Western NSW Local Health District, Brewarrina, Australia

Contributions: (I) Conception and design: AJ Saxby, D Schofield, F Tout; (II) Administrative support: K Ward, J Caswell; (III) Provision of study materials or patients: AJ Saxby; (IV) Collection and assembly of data: AJ Saxby, D Schofield, F Tout, J Gordon, MM Verkerk, T Watson, N Jufas, JHK Kong, N Patel; (V) Data analysis and interpretation: AJ Saxby, D Schofield, F Tout, J Gordon, MM Verkerk, T Watson, N Jufas, JHK Kong, N Patel; (VI) Manuscript writing: All authors; (VII) Final approval of manuscript: All authors.

Correspondence to: A/Prof. Alexander J. Saxby, MB, BChir, MA (Cantab.), FRACS. Department of Otolaryngology, Royal Prince Alfred Hospital, 50 Missenden Road, Camperdown, Sydney, NSW 2050, Australia; Faculty of Medicine, University of Sydney, Sydney, Australia. Email: alex.saxby@sydney.edu.au.

Background: The Earphone Project aims to create a suite of tele-otology tools that enable an Aboriginal Health Worker in a remote setting to perform an otological assessment of a patient, including an automated audiogram, tympanometry, a patient history questionnaire and video otoscopy.

Methods: This prospective pilot study enrolled patients over four years of age presenting with an ear complaint to Ear, Nose and Throat (ENT) clinics between June 2019 and May 2021. Patients were assessed with three different automated audiometers, two different video otoscopes and one handheld tympanometer, plus a tablet-based multiple-choice medical history questionnaire. Device performance was assessed, and the effectiveness of the study protocol was evaluated. Outcome measures included assessment of image quality, patient experience, audiometry and tympanometry concordance with gold standard audiologist-led assessments, and concordance of the tele-otology diagnosis with a face-to-face clinical diagnosis by a senior otolaryngologist. The aim was to create a robust protocol to take through to a larger study in which the primary outcome measure would be diagnostic concordance.

Results: Tele-assessment was successful in distinguishing an abnormal from a normal ear with an overall accuracy of 78.4%, giving a sensitivity, specificity, positive and negative predictive value of 81.3%, 73.1%, 84.8% and 67.9%, respectively. All three automated audiometers met the standard required, with high concordance with gold standard audiometry. The HearTest phone-based audiometer was considered the best-performing audiometer, with the highest accuracy (89.1%). The HearScope video otoscope produced the best image quality and had a superior image-capture system. The handheld Amplivox tympanometer had excellent concordance with gold standard tympanometry. The history questionnaire was considered appropriate and adequate for the needs of tele-assessment. Patient feedback was overall extremely positive.

Conclusions: This pilot study enabled selection of which telehealth tools were the most suitable for use in the future main Aboriginal community study. The study design proved to be robust and fit for purpose. Valuable conclusions were drawn to optimize the main study to follow.

Keywords: Telehealth; tele-otology; automated audiology; screening; Indigenous health


Received: 10 December 2023; Accepted: 24 July 2024; Published online: 03 September 2024.

doi: 10.21037/ajo-23-59


Video S1 Video file example of Cupris TYM video otoscopy.
Video S2 Video file example of HearScope video otoscopy.

Introduction

Automated audiometry and video otoscopy have enabled clinicians to diagnose and manage patients with ear disease and hearing loss in remote and rural communities around the world (1-5). In Australia, such telehealth strategies have become increasingly important as part of an alternative health care model in Aboriginal and Torres Strait Islander communities with limited access to Ear, Nose and Throat (ENT) specialists and audiologists (2,6-8).

Aboriginal and Torres Strait Islander children have the highest rates of otitis media in the world and are twice as likely to have long-term hearing loss as non-Aboriginal children, with subsequent detrimental effects on speech and language development, numeracy and literacy rates, employment and risk of incarceration (9). The high prevalence of ear disease in rural Australia has been declared by the World Health Organization a public health emergency requiring immediate intervention (9,10).

Research efforts and interventions dedicated to “closing the gap” in ear disease and hearing loss in these communities were the subject of a 2017 report by the Standing Committee on Health, Aged Care and Sport to the Australian Government. Amongst a suite of recommendations to prevent and manage ear disease in these communities, the report highlighted the use of telemedicine and tele-audiology and recommended greater availability of, and training in, video otoscopy in rural primary care (9).

Telehealth strategies have been shown to play a role in increasing screening rates for ear disease and reducing waiting times for review by an ENT specialist (5,8). Smartphone otoscopes can record images of comparable diagnostic utility to conventional otoscopy (11-16) and several studies have demonstrated inter-rater agreement to diagnose ear disease using specific devices that share diagnostic information between rural and remote healthcare units and tertiary ENT specialist centres (2,12,17).

Automated audiometry refers to hearing tests which are self-administered from the point the test begins rather than being manually administered by a trained hearing professional. In the last decade, automated hearing assessments, including those using machine learning techniques, have demonstrated similar accuracy, reliability and time efficiency as manual hearing assessments and have the potential to provide an inexpensive, accessible option for hearing loss diagnosis within a telehealth context (18).

However, telehealth models for ear and hearing health are not without limitations. Such novel technologies require rigorous evaluation in their intended setting before widespread adoption. The additional time taken to evaluate telehealth images may introduce unnecessary delays to referral (2). Inter- and intra-rater agreement may vary depending on the subtype of ear disease encountered (19). Images captured by video or photo-otoscopy are of limited use if wax obscures the view of the tympanic membrane (TM) and ear canal, and static images may be more difficult for off-site clinicians to interpret than video (20). Other key barriers to implementation include adequacy of initial training and ongoing education in the use of a specific device, usability and cultural acceptability, cost, internet coverage, and infrastructure for maintenance and repair (9).

The purpose of this pilot study was to compare different available telehealth devices in a model of care linking rural communities in Australia with a tertiary ENT specialist centre. Whilst previous studies have compared individual video otoscopes and automated audiometers to their gold standard counterparts, there is a paucity of evidence comparing different telehealth devices to one another. Prior to a larger study using standardised devices, the aim was to compare devices in terms of their diagnostic accuracy, image quality, ease of use, cost and feedback from clinicians and patients. We present this article in accordance with the STARD reporting checklist (available at https://www.theajo.com/article/view/10.21037/ajo-23-59/rc).


Methods

The study was conducted in accordance with the Declaration of Helsinki (as revised in 2013). It was approved by the Aboriginal Health & Medical Research Council of NSW (AHMRC) (HREC Ref: 1540/19) as well as the Sydney Local Health District Ethics Review Committee (Project No. X19-0042 & 2019/ETH00242). Written informed consent was obtained from the patients or their parents/guardians prior to participation in the study and participation was entirely voluntary.

This prospective study was conducted between June 2019 and May 2021, enrolling patients over four years of age presenting with an ear complaint to a public tertiary referral hospital ENT clinic or the private clinical office of the lead author (A.J.S.). The study design algorithm is shown in Figure 1. Patients underwent sequential assessment with the telehealth tools followed by a standard clinical assessment with an ENT surgeon. The research component was divided into a digital history questionnaire, video otoscopy, tympanometry and automated audiometry. The clinical component was performed by a single senior ENT surgeon (A.J.S.) blinded to the results of the preceding research assessment, with a gold standard audiological assessment performed by a qualified audiologist. The gold standard audiograms were not performed by a single audiologist, but rather by whichever audiologist was on duty that day; however, all were senior, fully qualified audiologists following standard audiological protocols. This pilot study took place during the global coronavirus disease 2019 (COVID-19) pandemic and outreach trips to remote Aboriginal communities were therefore paused. During this period, approval was gained through the ethics committee to shift the study to city-based ENT clinics. Given that the primary purpose of this pilot study was to assess the merit of the various telehealth tools, this adjustment was deemed acceptable, in the knowledge that the main study would proceed as planned in the remote communities at a later stage, following resolution of the pandemic.

Figure 1 Study protocol flow diagram.

Patient history questionnaire (PHQ)

An intuitive digital PHQ was created on an online platform (REDCap Research Electronic Data Capture Program), whereby positive responses prompted further questions related to that symptom, whereas negative responses terminated further questioning in that field. The full questionnaire is available in Appendix 1. It was adapted, with the author’s permission, from a verified questionnaire used to screen for ear disease in a study of smartphone otoscopy in Nepal (13) and later Cambodia (4). It was designed to simulate an otological history taken by an ENT surgeon.
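
To illustrate the branching behaviour of the PHQ (the actual logic is configured within the REDCap platform rather than hand-coded), the sketch below shows the general pattern: a positive answer opens follow-up questions for that symptom, while a negative answer closes the branch. The symptom names and follow-up questions are hypothetical, not taken from the study instrument.

```python
# Minimal sketch of PHQ-style branching (hypothetical questions and field names;
# the real questionnaire is implemented with REDCap branching logic).

FOLLOW_UPS = {
    "ear pain": ["Which ear is painful?", "How long has the pain been present?"],
    "ear discharge": ["Which ear discharges?", "Is the discharge offensive or blood-stained?"],
    "hearing loss": ["Which ear hears worse?", "Did the loss come on suddenly or gradually?"],
}

def ask(question: str) -> str:
    """Collect a free-text or yes/no answer from the participant."""
    return input(f"{question} ").strip().lower()

def run_questionnaire() -> dict:
    responses = {}
    for symptom, follow_ups in FOLLOW_UPS.items():
        answer = ask(f"Do you have {symptom}? (yes/no)")
        responses[symptom] = answer
        if answer == "yes":            # positive response: open the follow-up branch
            for question in follow_ups:
                responses[question] = ask(question)
        # negative response: the branch is skipped and questioning moves on
    return responses

if __name__ == "__main__":
    print(run_questionnaire())
```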

Video otoscopy

Two video otoscopy devices approved for use by the Australian Therapeutic Goods Administration (TGA) were selected: the TYM Smartphone Otoscope (Cupris Ltd., Somerset, UK) and the HearScope Digital Video Otoscope (HearX Group, Pretoria, South Africa) (Figure 2). The selection criteria were a video system approved for clinical use, readily available, easy to use and able to export results for electronic distribution. A 5–10 second video of each ear canal was recorded by the research assistant, who had been trained in how to use the device.

Figure 2 Picture of the two video otoscopes trialled in the pilot study. Left: Cupris TYM video otoscope. Right: HearX HearScope video otoscope. The Cupris TYM system uses an adapter that mounts onto a specialised phone case, utilising the light and camera of the smartphone. The HearScope system plugs into the smartphone and has a roller switch for adjustable light intensity on the wire, plus a focus control wheel on a separate handheld camera stylus extension.

Automated audiometry and tympanometry

We reviewed several potential tools for the automated audiometry component. Selection was based on TGA approval, portability, ease of use, data exportability and cost. We also considered the operating platform, resulting in a final selection of the tablet-based ShoeBox Audiometer (ShoeBox Ltd., Ottawa, Canada), the computer-based Electronica 800M Air Conduction Screening Audiometer (Electronica Technologies, France) and the smartphone-based HearX HearTest Audiometer (HearX Group, Pretoria, South Africa). All were supplied with calibrated over-the-ear headphones. All systems assessed air conduction pure tone thresholds at 500, 1,000, 2,000 and 4,000 Hz without masking or bone conduction capabilities. The research assistant would explain the test to the patient and apply the headphones; the program (ShoeBox, Electronica or HearTest) would then run automatically, cycling through the four pure tone frequencies and adjusting decibel output in line with the patient’s responses to determine their thresholds.
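
The adaptive procedures used by the commercial audiometers are proprietary and are not reproduced here. Purely as an illustration of the kind of automated threshold search described above, the sketch below implements a generic modified Hughson-Westlake style “10 dB down, 5 dB up” rule, assuming a simple present_tone callback (a placeholder, not a real device API) that plays a calibrated tone and reports whether the patient responded.

```python
# Illustrative sketch only: a generic "10 dB down, 5 dB up" pure tone threshold
# search of the kind automated audiometers perform. The ShoeBox, Electronica and
# HearTest devices use their own proprietary procedures; present_tone is a
# placeholder for playing a calibrated tone and recording the patient's response.
from typing import Callable, Dict

FREQUENCIES_HZ = [500, 1000, 2000, 4000]

def find_threshold(present_tone: Callable[[int, int], bool],
                   freq_hz: int,
                   start_db: int = 40,
                   floor_db: int = -10,
                   ceiling_db: int = 90) -> int:
    """Return the lowest level (dB HL) heard on two presentations (simplified rule)."""
    level = start_db
    responses_at_level: Dict[int, int] = {}   # level -> number of responses recorded
    while True:
        if present_tone(freq_hz, level):
            responses_at_level[level] = responses_at_level.get(level, 0) + 1
            if responses_at_level[level] >= 2:      # simplified threshold criterion
                return level
            level = max(floor_db, level - 10)       # drop 10 dB after a response
        else:
            if level >= ceiling_db:                 # no response at the test limit
                return ceiling_db
            level = min(ceiling_db, level + 5)      # rise 5 dB after no response

def run_audiogram(present_tone: Callable[[int, int], bool]) -> Dict[int, int]:
    """Cycle through the four test frequencies and return threshold estimates."""
    return {freq: find_threshold(present_tone, freq) for freq in FREQUENCIES_HZ}

# Example with a simulated patient whose true thresholds are known:
if __name__ == "__main__":
    true_thresholds = {500: 25, 1000: 30, 2000: 40, 4000: 55}
    simulated_patient = lambda f, db: db >= true_thresholds[f]
    print(run_audiogram(simulated_patient))
```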

For tele-tympanometry, the Amplivox Otowave 102 Tympanometer (Amplivox Ltd., Birmingham, UK) was selected as it was approved for use in Australia by the TGA, easy to use, widely available and highly portable, with printable results for uploading online.

Patient allocation

Allocation to one of the three automated audiometers and one of the two video otoscopes was sequential rather than formally randomised, with the aim of assessing an equal number of patients with each system.

Online data collection

Each participant’s PHQ, otoscopy video, audiograms and tympanograms were uploaded, deidentified and labelled with a unique participant ID, to a secure online database (Sydney University REDCap Research Electronic Data Capture Program).

Standard clinical consultation and audiological assessment

Following the telehealth data acquisition, participants were reviewed by a single senior ENT surgeon (A.J.S.) who conducted a history and examination, using standard clinic equipment such as microscopy, microsuction and pneumatic otoscopy where required.

The patient also underwent a standard audiologist-led audiological assessment in a sound-proof booth using standard air and bone conduction protocols, with masking where needed. Tympanometry was also performed by the audiologist. This served as the “gold standard” against which the researcher-performed tele-tympanometry and automated audiograms were compared.

Data analysis

  • Patient experience was gauged through a feedback questionnaire (Appendix 2).
  • Image quality of the two video otoscopes was assessed with a subjective 5-point grading scale. Four ENT surgeons (A.J.S., N.J., N.P., J.H.K.K.) were asked to grade a set of 40 images (20 of each system, presented in random order) based on their perception of the quality of the image on a 5-point scale (0 worst, 5 best) blinded to which system was used and to any clinical information. Comparison of the pooled results was made using a paired t test.
  • Field of view comparisons were made between video otoscopy and onsite clinical otoscopy by recording how many quadrants of the TM were visible in each case.
  • Automated audiograms were compared to the gold standard audiologist-performed audiograms using Bland-Altman plots (21), which graphically display the deviation from the gold standard result in decibels against the mean of the two audiograms’ hearing thresholds at each frequency tested. Sensitivity and specificity were calculated for whether the automated audiometers were able to detect hearing loss, defined as a threshold of >25 dB HL, compared with the gold standard audiogram, at all frequencies. From this an accuracy calculation was derived using the formula: (true positives + true negatives)/total number of cases. A computational sketch of these metrics is given after this list.
  • Comparison was made between the standard audiologist-performed tympanometry result and the result achieved with the tele-tympanometer. Cohen’s kappa (κ) was used to gauge agreement between the two tympanometer results.
  • Clinical diagnostic concordance: the onsite ENT surgeon (A.J.S.) recorded a diagnosis and management decision based on the clinical history, examination of the patient and the gold standard audiogram and tympanogram. Three offsite ENT surgeons (N.J., N.P., J.H.K.K.) recorded their proposed diagnosis and management plan based on their interpretation of the telehealth recorded data (automated audiogram, tele-tympanometry, video otoscopy and PHQ). These two clinical assessments were then compared for diagnostic concordance. For the purpose of the pilot study, diagnoses were characterised as normal or abnormal rather than as specific clinical entities, which will be explored more widely in the main study. Where discrepancies arose, note was made of which diagnosis had been missed.
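
For clarity, the sketch below illustrates how these comparison metrics (Bland-Altman bias and limits of agreement, sensitivity/specificity/accuracy at the >25 dB HL cut-off, and Cohen’s kappa for tympanometry agreement) can be computed from paired data. The variable names and the small arrays are illustrative toy data only, not the study dataset.

```python
# Illustrative computation of the analysis metrics (toy data, not study data).
import numpy as np
from sklearn.metrics import cohen_kappa_score

# Paired hearing thresholds (dB HL) at one frequency: automated vs gold standard.
auto = np.array([15, 30, 45, 25, 60, 10])
gold = np.array([20, 30, 40, 30, 55, 10])

# Bland-Altman quantities: mean difference (bias) and 95% limits of agreement,
# with the mean of each pair forming the x-axis of the plot.
diff = auto - gold
pair_mean = (auto + gold) / 2
bias = diff.mean()
limits = (bias - 1.96 * diff.std(ddof=1), bias + 1.96 * diff.std(ddof=1))

# Hearing loss detection (>25 dB HL) versus the gold standard audiogram.
auto_loss, gold_loss = auto > 25, gold > 25
tp = np.sum(auto_loss & gold_loss)
tn = np.sum(~auto_loss & ~gold_loss)
fp = np.sum(auto_loss & ~gold_loss)
fn = np.sum(~auto_loss & gold_loss)
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
accuracy = (tp + tn) / (tp + tn + fp + fn)   # (true pos + true neg) / total cases

# Agreement between handheld and benchtop tympanometry (categorical curve types).
tele_tymp = ["A", "B", "A", "C", "B", "A"]
bench_tymp = ["A", "B", "A", "A", "B", "A"]
kappa = cohen_kappa_score(tele_tymp, bench_tymp)

print(f"Bland-Altman bias {bias:.1f} dB, limits of agreement {limits[0]:.1f} to {limits[1]:.1f} dB")
print(f"sensitivity {sensitivity:.2f}, specificity {specificity:.2f}, accuracy {accuracy:.2f}")
print(f"Cohen's kappa {kappa:.2f}")
```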

Results

Forty-eight patients (96 ears) were enrolled in the pilot study: 23 females and 25 males with a mean age of 50.4 years (range, 4–84 years). The cohort was predominantly adults, with only five participants under the age of 18 years. Complete data sets were achieved for 36 patients (75%), with each incomplete data set missing only one piece of information (three missing tympanograms, one missing automated audiogram, seven missing gold standard audiograms, one missing video otoscopy recording).

Audiology assessment

Figure 3 shows a graphical representation of the accuracy of the automated audiometers compared with the gold standard audiogram at different frequencies. Across a range of frequencies and differing levels of hearing loss, the automated systems performed well in all cases. For each system, a fair spread of hearing loss severities was assessed. The graphs show that all three systems matched the gold standard better at higher frequencies than at lower frequencies. The laptop-based Electronica system showed the most variance from the gold standard at low-frequency assessment. Overall, all systems were able to detect hearing loss and would be suitable for a telehealth system; however, the HearTest system gave the best performance. Table 1 shows the sensitivity, specificity and overall accuracy of each automated audiometer using pooled data from each frequency tested (500 Hz, 1 kHz, 2 kHz and 4 kHz). The HearTest system also benefited from a lower cost per audiogram, which strengthened its selection as the automated audiometer for the main study.

Figure 3 Bland-Altman plots showing comparison of the three automated audiometers at each frequency. Difference in hearing threshold is plotted against the average threshold to show how performance differed at different patient hearing levels. Each data point represents a different patient. Perfect concordance would plot all results along the central zero line indicating that whatever the level of patient hearing loss, the result of the automated audiogram was the same as the gold standard.

Table 1

Sensitivity and specificity of each automated audiometer in detecting hearing loss (defined as a hearing threshold of >25 dB HL) against a reference gold standard audiogram performed by an audiologist

Measure   Electronica (laptop-based)   ShoeBox (tablet-based)   HearTest (phone-based)
Sensitivity (%)   88.5   71.1   85.3
Specificity (%)   76.9   92.0   93.3
Accuracy (%)   82.7   83.0   89.1

Tympanometry

Tympanometry data showed a high degree of concordance between the handheld tele-device and the benchtop gold standard tympanometer, with agreement in 92.2% of cases and a Cohen’s kappa of κ=0.84, classified as near-perfect agreement. The accuracy was deemed high enough to warrant ongoing use in the main study.

Video otoscopy

The two video otoscopy systems differed considerably in their handling. The Cupris TYM system is directly coupled to the smartphone camera and light source through a model-specific case, whereas the HearScope system plugs into the smartphone via a cable and has a separate stylus camera with adjustable light intensity and a focus bar (Figure 2). Having a screen separated from the camera allowed more freedom of movement and also permitted the patient to see what was being filmed, which was commented on positively in patient feedback. Angulation of the speculum within the canal to visualise the drum was straightforward with both systems. Videos S1 and S2 contain video examples of each system. A criticism of the Cupris TYM system was its reliance on a specific brand and model of phone for coupling to the light source and camera, whereas the HearScope system was more versatile, being compatible with any model of phone or tablet.

In the majority of cases the video otoscopes captured fewer quadrants of the TM when compared to gold standard assessments. Gold standard assessments achieved a full view of the TM in 80.5% of cases, whereas this was only accomplished in 11.7% of the video-otoscopy assessments.

Subjective rating scores on a 5-point scale for image quality of the two video otoscopes showed superior clarity for the HearScope system, with an average rating of 3.1 compared with 2.4 for the Cupris TYM (P=0.06) (see Figure 4). As a result, the HearScope system was proposed to be taken forward to the main study. An added benefit is that the HearScope video otoscope is integrated into the same unit as the HearTest audiometer, which was also the preferred hearing assessment tool.

Figure 4 Subjective image quality assessment of video otoscopy. Quality of the image graded on a 5-point scale (0 worst, 5 best) for a representative sample of each system blinded to their clinical information.

Clinical diagnosis

Tele-assessment diagnosis was in agreement with the onsite “gold standard” diagnosis in 78.4% of cases, giving a sensitivity, specificity, positive and negative predictive value of 81.3%, 73.1%, 84.8% and 67.9%, respectively. Of the 17 cases where diagnoses differed, ten were considered normal on the tele-assessment but abnormal on the onsite assessment (labelled “missed diagnoses”) and seven were classed as abnormal on the tele-assessment but normal when seen by the onsite clinician (labelled “false diagnoses”) (Table 2). Of the cases where a misdiagnosis was recorded, a much higher proportion had used the Cupris TYM video otoscope (82%, see Table 3). Conversely, which audiometer was used had no apparent impact on diagnostic failures.

Table 2

Disagreement in diagnoses between offsite tele-interpretation compared to gold standard on-site in person clinical consultation

Missed diagnosis (N=10)   False diagnosis (N=7)
Tinnitus (n=3)   Glue ear (n=3)
Glue ear (n=2)   Exostoses (n=2)
Wax impaction (n=2)   Sensorineural hearing loss (n=2)
Perforation (n=1)   –
Conductive hearing loss (n=1)   –
Vertigo (n=1)   –

“Missed diagnosis” refers to cases where tele-assessment recorded a normal diagnosis, whereas onsite interpretation found an abnormality. “False diagnosis” refers to cases where the onsite assessment was normal, but the tele-interpretation registered an abnormality.

Table 3

Video otoscope use for the 17 cases where onsite and offsite diagnosis differed

Otoscope Missed diagnosis False diagnosis Total misdiagnoses
Cupris TYM 8 (80%) 6 (86%) 14 (82%)
HearScope 2 (20%) 1 (14%) 3 (18%)

Data are presented as n (%).

Patient experience

The patient feedback was very positive. On a five-point scale grading comfort from “Excellent” [5] to “Very Poor” [1], all devices had a mean rating >4. For video otoscopy, the HearScope device was better tolerated by patients, with a mean comfort score of 4.8 compared with 4.0 (P=0.01). There was no statistical difference between the comfort scores for any of the audiometers, automated or gold standard (P=0.27). When asked to rate their experience with the ENT surgeon, patients reported a mean of 4.8.


Discussion

The primary purpose of this pilot study was to assess a number of potential telehealth tools grouped as a composite telehealth ear assessment to be used in a remote health setting. Based on our findings, the authors decided to proceed with the following devices:

  • HearX HearScope for video otoscopy;
  • HearX HearTest for automated audiology;
  • Amplivox Otowave Tympanometer.

The pilot project revealed several advantages and disadvantages of each device and enabled the most appropriate to be chosen for the main study. From this, several important conclusions were drawn that may prove useful to those seeking to set up a telehealth system of their own.

Hearing assessment

The study indicated that all three of the trialled automated audiometers performed well, with close correlation to the gold standard audiograms. Performance was independent of the patient’s level of hearing loss and consistent across the range of frequencies tested. Whilst the HearTest device showed slightly closer correlation to the gold standard audiometer, all devices were deemed fit for purpose in a telehealth setting. However, the additional portability and low cost of the HearTest device made it better suited to a telehealth application than its comparators.

One major limitation in the use of automated audiology is the lack of both masking and bone conduction testing, meaning conductive hearing loss cannot be distinguished from sensorineural hearing loss. However, for the purpose of a screening tool in a remote telehealth environment, what is paramount is the assessment of normal versus abnormal hearing and the severity of any loss, as this guides the decision to refer the patient to a specialist centre.

The benefit of an automated system to telehealth applications is the ability to screen hearing in the absence of a qualified audiologist or trained staff member. Access to quality hearing testing is challenging in rural and remote communities, not only due to the lack of qualified or trained personnel but also due to lack of equipment and appropriate sound insulated environments in which to perform the testing. A portable system that tests hearing in an automated manner without the need for prior audiological training helps to bridge this gap. It is not a substitute for formal audiological testing but offers a screening alternative that helps to triage those that require more detailed testing.

Tympanometry was well accomplished by the handheld device tested in this pilot study and correlated with the gold standard benchtop alternative. Where results differed, it was deemed due to canal obstruction such as wax giving a falsely flat result on the tele-tympanometry, not seen on the gold standard tympanometry because of canal toileting during the intervening ENT surgeon consultation. There may also have been slight differences of opinion, such as a flat curve being interpreted by one ENT surgeon as “Type As” but by another as “Type B”. The handheld nature of the Amplivox tympanometer makes it ideal for use in the main study alongside the other tablet-based screening tools.

Video otoscopy

There is a plethora of devices on the market that enable recording of images of the ear canal and TM. This study was limited to those that had been approved for medical purposes through the relevant regulatory authority (The Australian Therapeutic Goods Administration).

The HearScope system outperformed the Cupris TYM in a number of ways. The recorded images were clearer on subjective assessment and appeared to result in fewer misdiagnoses. A camera that could connect to any device was more useful than one which attaches to a single model of smartphone. The ability to focus the image and adjust light intensity were additional favourable characteristics.

The camera systems struggled to capture the same proportion of the TM as an otologist using a handheld otoscope or microscope. This is perhaps not surprising given the difference in experience between the operators: one an experienced ENT surgeon and the other an investigator with limited exposure to using an otoscope. However, if a telehealth system of this nature is to be successful it must accommodate differing levels of expertise and still give sufficient visualisation of the ear canal and TM for offsite interpretation. This highlights the importance of adequate training of local operators in the use of any telehealth system prior to implementation.

Any video otoscopy system will be hindered by obstruction within the ear canal, inhibiting visualisation of the TM. This accounts for some of the poor performance of the systems in this pilot. When setting up telehealth clinics, it is therefore also important to upskill onsite practitioners in ear toileting techniques such as tissue spears, wax curettage or betadine irrigation.

History questionnaire

The PHQ used in this study was modified from one verified in a previous study and shown to be effective in screening for otologic pathology (4,13). Whilst this cannot replace the dynamic and interactive history taking of a medical consultation, no concerns were raised by any of the offsite clinicians in terms of missing critical information. However, those administering the questionnaire noted that the wording of the questions was at times too long and complex. The authors felt that it would be possible to simplify the language for future studies whilst maintaining clinical utility.

Clinical diagnosis

The ability of the offsite clinicians to correctly differentiate normal from abnormal ears using telehealth information was good. However, some important pathology was missed. Inner ear pathology (vertigo, tinnitus) was less well captured by our telehealth assessments. Several of the other misdiagnoses were probably due to variations in judgement of clinical importance rather than a fault of the telehealth system. For example, one surgeon might comment that wax was the primary problem, whilst another felt it was present but not of clinical concern. More concerning were cases where the misdiagnosis was a middle ear effusion, a clinically important diagnosis in any remote screening program. This was missed in two cases and falsely diagnosed in three others. The misinterpretation appears to have been due to discrepancies in, or absence of, tympanometry data, which highlights the importance of that particular data field in any telehealth system.

The larger main study, using just one otoscope and automated audiometer, will be better powered to address diagnostic concordance. Ultimately, the ability of the telehealth system to accurately identify a patient who requires intervention is critical. What this pilot aimed to discover was which of the trialled devices would likely make for the best telehealth system to take through to that larger study. Misdiagnosis did not appear to be correlated with which audiometer was used, but was related to which video otoscopy system was involved. This highlights how important visual information is to the offsite interpretation of pathology, in conjunction with the other tele-information.

Limitations of the study

Allocation of devices was made on a sequential basis rather than by randomisation, which may have had implications in terms of learning curve and training level, but such effects were considered likely to be minimal.

The pilot study mainly recruited adult patients, whereas a higher proportion of children is expected in the main study. It is possible that conclusions drawn in adults will not be as applicable in the paediatric setting, but this was considered unlikely: the pilot assessed quality of image capture and accuracy against gold standards, attributes that should remain just as desirable in the assessment of a child’s ear. No specific problems were encountered in any of the paediatric patients recruited to the pilot study.

The study was not an exhaustive review of all available telehealth tools, and new technologies are constantly being released which may exceed the capabilities of those tested. In terms of study design, having the patient undergo telehealth assessment followed by otolaryngologist consultation may have resulted in more focused or different history responses, but examination findings should not have varied. Fatigue, particularly in paediatric patients, may have impacted on aspects of testing requiring patient concentration, such as the audiological assessment, but the close correlation of the audiograms suggests this was not a significant problem.

Future considerations for the main study

An optimal telehealth tool is one which is portable, simple to use, reliable and accurate. Going into the project, it was assumed that a phone-based system might be the most useful, but in practice the screen size made reading the history questionnaire difficult and uploading data more problematic. The laptop computer offered the most powerful and sophisticated operating system, but in this case was deemed too complicated for someone with limited exposure to pick up and use effectively. The tablet was therefore chosen as the ideal platform for the main study, offering a good-sized screen for viewing data and completing tasks while also being very portable and lightweight.

Ideally, all of the various components of the telehealth screening tool should be incorporated into one portable unit. The interactive questionnaire, camera system and automated audiometer can easily be integrated onto a tablet, but the tympanometer would still need to be a separate entity. Measurement of impedance requires more than electronic drivers, needing specialised pressure transducers that cannot be combined into a standard tablet. Therefore, for the main study the telehealth assessment system will include two main pieces of equipment: a tablet connected to specialised headphones and an attached video otoscope, plus a separate handheld tympanometer. The authors feel this still represents a very portable system for use in a remote setting.


Conclusions

Telehealth assessment of ear pathology is possible through a combined system of history taking, video otoscopy, tympanometry and automated audiometry, all of which should be easily managed by local caregivers. This pilot study has determined which tools will be taken forward to a main study in the remote setting to test this hypothesis.


Acknowledgments

The authors would like to thank the following people who were involved in (a) protocol preparation, (b) clinical implementation or (c) an advisory capacity: Alice Gordon (b), Ben Dixon (b), Bianca Cochrane-Owers (b), Boe Rambaldini (c), Daniel Kelly (c), Dee Anna Nixon (b), Elizabeth Harrop (a), Heather Finlayson (c), Helen Ferguson (c), James Schuster-Bruce (a), Julie Knight (c), Keahly Abbott (b), Kelvin Kong (a,c), Kylie Gwynne (a), Mahmood Bhutta (a,c), Mary Jones (b,c), Melanie Bird (a,b), Vita Christie (c).

Funding: This work was supported by a financial grant from the Sydney Local Health District, NSW Government, Australia awarded at their 2019 “Pitch” program.


Footnote

Reporting Checklist: The authors have completed the STARD reporting checklist. Available at https://www.theajo.com/article/view/10.21037/ajo-23-59/rc

Data Sharing Statement: Available at https://www.theajo.com/article/view/10.21037/ajo-23-59/dss

Peer Review File: Available at https://www.theajo.com/article/view/10.21037/ajo-23-59/prf

Conflicts of Interest: All authors have completed the ICMJE uniform disclosure form (available at https://www.theajo.com/article/view/10.21037/ajo-23-59/coif). A.J.S. serves as an unpaid editorial board member of Australian Journal of Otolaryngology from January 2019 to December 2024. The other authors have no conflicts of interest to declare.

Ethical Statement: The authors are accountable for all aspects of the work in ensuring that questions related to the accuracy or integrity of any part of the work are appropriately investigated and resolved. The study was conducted in accordance with the Declaration of Helsinki (as revised in 2013). It was approved by the Aboriginal Health & Medical Research Council of NSW (AHMRC) (HREC Ref: 1540/19) as well as the Sydney Local Health District Ethics Review Committee (Project No. X19-0042 & 2019/ETH00242). Written informed consent was obtained from the patients or their parents/guardians prior to participation in the study and participation was entirely voluntary.

Open Access Statement: This is an Open Access article distributed in accordance with the Creative Commons Attribution-NonCommercial-NoDerivs 4.0 International License (CC BY-NC-ND 4.0), which permits the non-commercial replication and distribution of the article with the strict proviso that no changes or edits are made and the original work is properly cited (including links to both the formal publication through the relevant DOI and the license). See: https://creativecommons.org/licenses/by-nc-nd/4.0/.


References

  1. Robler SK, Platt A, Turner EL, et al. Telemedicine Referral to Improve Access to Specialty Care for Preschool Children in Rural Alaska: A Cluster-Randomized Controlled Trial. Ear Hear 2023;44:1311-21. [Crossref] [PubMed]
  2. Habib AR, Perry C, Crossland G, et al. Inter-rater agreement between 13 otolaryngologists to diagnose otitis media in Aboriginal and Torres Strait Islander children using a telehealth approach. Int J Pediatr Otorhinolaryngol 2023;168:111494. [Crossref] [PubMed]
  3. Alenezi EM, Veselinović T, Tao KF, et al. Ear Portal: An urban-based ear, nose, and throat, and audiology referral telehealth portal to improve access to specialist ear-health services for children. J Telemed Telecare 2023; Epub ahead of print. [Crossref] [PubMed]
  4. Schuster-Bruce J, Shetty P, O’Donovan J, et al. Comparative performance of prediction model, non-expert and telediagnosis of common external and middle ear disease using a patient cohort from Cambodia that included one hundred and thirty-eight ears. Clin Otolaryngol 2021;46:635-41. [Crossref] [PubMed]
  5. Ravi P, Ramkumar V, Rajendran A, et al. Tele-Audiological Surveillance of Middle Ear Status among Individuals with Cleft Lip and/or Palate in Rural South India. J Am Acad Audiol 2020;31:185-94. [Crossref] [PubMed]
  6. Elliott G, Smith AC, Bensink ME, et al. The feasibility of a community-based mobile telehealth screening service for Aboriginal and Torres Strait Islander children in Australia. Telemed J E Health 2010;16:950-6. [Crossref] [PubMed]
  7. Nguyen KH, Smith AC, Armfield NR, et al. Cost-Effectiveness Analysis of a Mobile Ear Screening and Surveillance Service versus an Outreach Screening, Surveillance and Surgical Service for Indigenous Children in Australia. PLoS One 2015;10:e0138369. [Crossref] [PubMed]
  8. Reeve C, Thomas A, Mossenson A, et al. Evaluation of an ear health pathway in remote communities: improvements in ear health access. Aust J Rural Health 2014;22:127-32. [Crossref] [PubMed]
  9. Leach A, Morris P. Otitis media and hearing loss among Aboriginal and Torres Strait Islander children: a research summary. Australian Parliament’s Standing Committee on Health, Aged Care and Sport public hearing in reference to the Inquiry into the Hearing Health and Wellbeing of Australia. June 7th 2017. Available online: https://www.aph.gov.au/DocumentStore.ashx?id=10703288-84e1-4581-84c2-2fc851a3d5a8&subId=511822
  10. World Health Organization. Chronic suppurative otitis media: burden of illness and management options. World Health Organization. 2004. Available online: https://iris.who.int/handle/10665/42941
  11. Biagio L. Asynchronous video-otoscopy with a telehealth facilitator. Telemed J E Health 2013;19:252-8. [Crossref] [PubMed]
  12. Biagio L. Video-otoscopy recordings for diagnosis of childhood ear disease using telehealth at primary health care level. J Telemed Telecare 2014;20:300-6. [Crossref] [PubMed]
  13. Mandavia R, Lapa T, Smith M, et al. A cross-sectional evaluation of the validity of a smartphone otoscopy device in screening for ear disease in Nepal. Clin Otolaryngol 2018;43:31-8. [Crossref] [PubMed]
  14. Mousseau S, Lapointe A, Gravel J. Diagnosing acute otitis media using a smartphone otoscope; a randomized controlled trial. Am J Emerg Med 2018;36:1796-801. [Crossref] [PubMed]
  15. Rappaport KM, McCracken CC, Beniflah J, et al. Assessment of a Smartphone Otoscope Device for the Diagnosis and Management of Otitis Media. Clin Pediatr (Phila) 2016;55:800-10. [Crossref] [PubMed]
  16. Moshtaghi O, Sahyouni R, Haidar YM, et al. Smartphone-Enabled Otoscopy in Neurotology/Otology. Otolaryngol Head Neck Surg 2017;156:554-8. [Crossref] [PubMed]
  17. Alenezi EMA, Jajko K, Reid A, et al. The reliability of video otoscopy recordings and still images in the asynchronous diagnosis of middle-ear disease. Int J Audiol 2022;61:917-23. [Crossref] [PubMed]
  18. Wasmann JW, Pragt L, Eikelboom R, et al. Digital Approaches to Automated and Machine Learning Assessments of Hearing: Scoping Review. J Med Internet Res 2022;24:e32581. [Crossref] [PubMed]
  19. Eikelboom RH, Mbao MN, Coates HL, et al. Validation of tele-otology to diagnose ear disease in children. Int J Pediatr Otorhinolaryngol 2005;69:739-44. [Crossref] [PubMed]
  20. Moberly AC, Zhang M, Yu L, et al. Digital otoscopy versus microscopy: How correct and confident are ear experts in their diagnoses? J Telemed Telecare 2018;24:453-9. [Crossref] [PubMed]
  21. Thompson GP, Sladen DP, Borst BJ, et al. Accuracy of a Tablet Audiometer for Measuring Behavioral Hearing Thresholds in a Clinical Population. Otolaryngol Head Neck Surg 2015;153:838-42. [Crossref] [PubMed]
doi: 10.21037/ajo-23-59
Cite this article as: Saxby AJ, Schofield D, Tout F, Gordon J, Verkerk MM, Watson T, Jufas N, Kong JHK, Patel N, Ward K, Caswell J. The Earphone Project pilot: a tele-otology study for remote Aboriginal communities. Aust J Otolaryngol 2024;7:38.
