Journal
PAIN
Volume 163, Issue 12, Pages 2326-2336
Publisher
LIPPINCOTT WILLIAMS & WILKINS
DOI: 10.1097/j.pain.0000000000002680
Keywords
Preclinical pain models; Machine learning; Machine vision; Automated pain detection
Funding
- Defense Advanced Research Projects Agency [HR0011-19-2-0022]
- NIH NINDS [F31 NS084716-02, R35 NS105076, R01 NS089521, F31 NS108450, R01 NA114202]
- Bertarelli Foundation
- Simons Collaboration on the Global Brain
- NIH BRAIN Initiative [U19 NS113201, U24 NS109520, R01AT011447]
- Boston Children's Hospital Technology Development Fund
- National Council for Scientific and Technological Development (CNPq, Brazil) [229356/2013-3]
Abstract
The lack of sensitive and robust behavioral assessments of pain in preclinical models has been a major limitation for both pain research and the development of novel analgesics. Here, we demonstrate a novel data acquisition and analysis platform that provides automated, quantitative, and objective measures of naturalistic rodent behavior in an observer-independent and unbiased fashion. The technology records freely behaving mice, in the dark, over extended periods for continuous acquisition of 2 parallel video data streams: (1) near-infrared frustrated total internal reflection for detecting the degree, force, and timing of surface contact and (2) simultaneous videography of whole-body pose. Using machine vision and machine learning, we automatically extract and quantify behavioral features from these data to reveal moment-by-moment changes that capture the internal pain state of rodents in multiple pain models. We show that these voluntary pain-related behaviors are reversible by analgesics and that analgesia can be automatically and objectively differentiated from sedation. Finally, we used this approach to generate a paw luminance ratio measure that is sensitive in capturing dynamic mechanical hypersensitivity over time and scalable for high-throughput preclinical analgesic efficacy assessment.
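The abstract describes the paw luminance ratio only at a high level: in FTIR imaging, a paw pressed firmly against the floor appears brighter, so comparing luminance between the injured and uninjured hind paws can index guarding. The sketch below shows one plausible way such a per-frame measure could be computed; the function name, the mask-based paw segmentation, and the ipsilateral/contralateral convention are assumptions for illustration, not the paper's actual implementation.

```python
import numpy as np

def paw_luminance_ratio(frame, ipsi_mask, contra_mask):
    """Hypothetical per-frame paw luminance ratio from an FTIR image.

    frame       : 2D array of near-infrared FTIR pixel intensities
    ipsi_mask   : boolean mask over pixels under the injured (ipsilateral) paw
    contra_mask : boolean mask over pixels under the uninjured (contralateral) paw

    Returns mean ipsilateral luminance divided by mean contralateral
    luminance; values below 1 would suggest reduced weight bearing
    (guarding) on the injured paw.
    """
    ipsi = frame[ipsi_mask].mean()
    contra = frame[contra_mask].mean()
    return ipsi / contra

# Toy example with a synthetic 4x4 frame: the injured paw's contact
# region is half as bright as the uninjured paw's, giving a ratio of 0.5.
frame = np.zeros((4, 4))
frame[0, 0] = 0.5   # dimmer contact under injured paw
frame[3, 3] = 1.0   # brighter contact under uninjured paw
ipsi = np.zeros((4, 4), dtype=bool)
ipsi[0, 0] = True
contra = np.zeros((4, 4), dtype=bool)
contra[3, 3] = True
print(paw_luminance_ratio(frame, ipsi, contra))  # 0.5
```

In practice the masks would come from the pose-tracking stream, and the ratio would be aggregated over time to yield the dynamic hypersensitivity readout the abstract mentions.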