When is it ok to look at ladies’ breasts?
When you’re part of a study being carried out by plastic and reconstructive surgery experts[1], of course!
A Polish research team led by plastic surgeon Dr. Piotr Pietruski enlisted 100 volunteers – 50 male and 50 female – to examine a selection of computer-generated images (CGIs) of ladies’ breasts. The volunteers’ gaze was recorded using eye-tracking technology to ascertain which parts of the breasts they looked at most often and for longest.
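How might such gaze recordings be turned into "most often and for longest" figures? The papers' processing pipelines aren't described in this post, but a common approach is to sum fixation durations per labelled region of interest. Here's a minimal sketch in Python; the data format, region names, and coordinates are illustrative assumptions, not details from the study.

```python
# Minimal sketch: turn raw eye-tracking fixations into per-region dwell times.
# Data format, region names, and coordinates are illustrative assumptions,
# not details taken from the studies cited above.

from typing import Dict, List, Tuple

Fixation = Tuple[float, float, float]     # (x_px, y_px, duration_ms)
Rect = Tuple[float, float, float, float]  # (x_min, y_min, x_max, y_max)

def dwell_times(fixations: List[Fixation],
                rois: Dict[str, Rect]) -> Dict[str, float]:
    """Sum fixation durations that fall inside each region of interest."""
    totals = {name: 0.0 for name in rois}
    for x, y, duration in fixations:
        for name, (x0, y0, x1, y1) in rois.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                totals[name] += duration
                break  # regions assumed non-overlapping
    return totals

# Hypothetical regions on a 1000 x 1000 px stimulus image.
rois = {"nipple_areola": (400, 450, 600, 600),
        "lower_breast": (350, 601, 650, 850)}
fixations = [(500, 500, 220.0), (480, 700, 310.0), (100, 100, 150.0)]
print(dwell_times(fixations, rois))
# -> {'nipple_areola': 220.0, 'lower_breast': 310.0}
```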
The study follows on from similar research[2] published in the US in 2018. Results from both experiments showed that observers focused most (52-57% of their time) on the nipple and lower breast area, with little differentiation between male and female participants. While the studies don’t offer insight into why certain shapes and sizes are deemed more attractive, surgeons expect to use the information gleaned to help create more aesthetically pleasing breasts in cosmetic and reconstructive operations.
It’s not the only area in which this computer vision technology has been used to support surgical research. A study in 2015[3] used eye-tracking to confirm that the upper lip and nose of patients with cleft deformities drew onlookers’ attention. And a more recent paper[4] used the technology to give insights into how observers from different racial backgrounds viewed the results of cosmetic nose jobs.
A further study published in 2018[5] used eye-tracking technology to analyze how human observers process the visual information provided by medical imaging. One finding of this research was that, surprisingly, the longer a specialist examined a mammogram, the more likely it was that abnormalities would be missed. Not so surprising was the conclusion that experts were better at identifying abnormalities than novices and could do so faster, with generally lower saccadic rates.
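A "saccadic rate" is the number of saccades – rapid jumps of the eye between fixations – per unit time. The paper's exact detection method isn't given in this post, but a common technique is velocity thresholding: any inter-sample gaze velocity above a cutoff is flagged as part of a saccade. A minimal sketch, where the 30°/s threshold and the sample data are assumptions:

```python
# Minimal sketch of a velocity-threshold saccade detector. The threshold
# value and the sample format are assumptions for illustration; real
# studies calibrate these to their eye tracker and viewing geometry.

from typing import List, Tuple

Sample = Tuple[float, float, float]  # (timestamp_s, x_deg, y_deg)

def saccade_rate(samples: List[Sample],
                 threshold_deg_per_s: float = 30.0) -> float:
    """Return saccades per second, counting each contiguous run of
    above-threshold inter-sample velocities as a single saccade."""
    saccades, in_saccade = 0, False
    for (t0, x0, y0), (t1, x1, y1) in zip(samples, samples[1:]):
        dt = t1 - t0
        if dt <= 0:
            continue
        velocity = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 / dt
        if velocity > threshold_deg_per_s:
            if not in_saccade:
                saccades += 1
            in_saccade = True
        else:
            in_saccade = False
    total_time = samples[-1][0] - samples[0][0]
    return saccades / total_time if total_time > 0 else 0.0

# Two fixations separated by one fast movement -> 1 saccade over 0.4 s.
samples = [(0.0, 1.0, 1.0), (0.1, 1.1, 1.0), (0.2, 9.0, 1.0),
           (0.3, 9.1, 1.0), (0.4, 9.0, 1.1)]
print(f"{saccade_rate(samples):.1f} saccades/s")  # -> 2.5 saccades/s
```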
Also exciting for the field of breast surgery is the announcement that Google Health’s AI is now more successful than human radiologists at reading mammograms. When faced with the X-rays of nearly 29,000 British and over 90,000 American women, the AI model correctly identified cancer in more cases than a single doctor could (the current practice in the US) and matched the identification rate of the double-reading system used in the UK, which requires two specialists to analyze the images. The algorithm reduced the number of false-positive results by up to 1.2% and false-negative results by up to 2.7% compared to the current UK method, and by 5.7% and 9.4% respectively compared to US data.
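To make those percentages concrete: they describe reductions in the false-positive rate (healthy women incorrectly flagged for follow-up) and the false-negative rate (cancers missed). A minimal sketch of how the two rates are computed from ground-truth labels and model predictions – the data below is invented for illustration:

```python
# Minimal sketch of the false-positive and false-negative rates quoted
# above. The labels and predictions are invented for illustration only
# (1 = cancer present / flagged, 0 = clear).

from typing import List, Tuple

def error_rates(labels: List[int],
                predictions: List[int]) -> Tuple[float, float]:
    """Return (false_positive_rate, false_negative_rate)."""
    fp = sum(1 for y, p in zip(labels, predictions) if y == 0 and p == 1)
    fn = sum(1 for y, p in zip(labels, predictions) if y == 1 and p == 0)
    return fp / labels.count(0), fn / labels.count(1)

labels      = [0, 0, 0, 1, 1, 0, 1, 0, 0, 0]
predictions = [0, 1, 0, 1, 0, 0, 1, 0, 0, 0]
fpr, fnr = error_rates(labels, predictions)
print(f"false-positive rate: {fpr:.1%}, false-negative rate: {fnr:.1%}")
# -> false-positive rate: 14.3%, false-negative rate: 33.3%
```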
These stories show the benefits that computer vision can bring to breast health. If years of experience and practice improve a specialist’s ability to interpret medical images correctly, then it stands to reason that a deep learning algorithm could learn these skills in a fraction of the time and be put to work reading mammograms with a high rate of success. But remember, the algorithms still need to be written, so it’s not all over to the computers just yet!
Active Silicon manufactures a range of components specifically for medical imaging, from embedded systems used in ophthalmic surgery and radiation therapy, to long-reach video cameras for live-streaming outside the operating theater. View our product range and get in touch to understand more about how we’re supporting modern medical innovation. You can also read more about eye-tracking applications in our recent blog.
[1] https://journals.lww.com/plasreconsurg/Fulltext/2019/12000/Analysis_of_the_Visual_Perception_of_Female_Breast.1.aspx
[2] https://journals.lww.com/plasreconsurg/Abstract/2018/03000/Where_Do_We_Look__Assessing_Gaze_Patterns_in.4.aspx
[3] https://www.researchgate.net/publication/281863707_Eye_tracker_based_study_Perception_of_faces_with_a_cleft_lip_and_nose_deformity
[4] https://jamanetwork.com/journals/jamafacialplasticsurgery/fullarticle/2720056
[5] https://www.researchgate.net/publication/326046602_State_of_the_Art_Eye-Tracking_Studies_in_Medical_Imaging