Article

Automated preclinical detection of mechanical pain hypersensitivity and analgesia

Journal

PAIN
Volume 163, Issue 12, Pages 2326-2336

Publisher

LIPPINCOTT WILLIAMS & WILKINS
DOI: 10.1097/j.pain.0000000000002680

Keywords

Preclinical pain models; Machine learning; Machine vision; Automated pain detection

Funding

  1. Defense Advanced Research Projects Agency [HR0011-19-2-0022]
  2. NIH NINDS [F31 NS084716-02, R35 NS105076, R01 NS089521, F31 NS108450, R01 NA114202]
  3. Bertarelli Foundation
  4. Simons Collaboration on the Global Brain
  5. NIH BRAIN Initiative [U19 NS113201, U24 NS109520, R01AT011447]
  6. Boston Children's Hospital Technology Development Fund
  7. National Council for Scientific and Technological Development (CNPq, Brazil) [229356/2013-3]

Abstract

The lack of sensitive and robust behavioral assessments of pain in preclinical models has been a major limitation for both pain research and the development of novel analgesics. Here, the authors demonstrate a novel data acquisition and analysis platform that provides automated, quantitative, and objective measures of naturalistic rodent behavior in an observer-independent and unbiased fashion. The technology records freely behaving mice, in the dark, over extended periods for continuous acquisition of 2 parallel video data streams: (1) near-infrared frustrated total internal reflection for detecting the degree, force, and timing of surface contact and (2) simultaneous ongoing videography of whole-body pose. Using machine vision and machine learning, they automatically extract and quantify behavioral features from these data to reveal moment-by-moment changes that capture the internal pain state of rodents in multiple pain models. They show that these voluntary pain-related behaviors are reversible by analgesics and that analgesia can be automatically and objectively differentiated from sedation. Finally, they used this approach to generate a paw luminance ratio measure that sensitively captures dynamic mechanical hypersensitivity over time and is scalable for high-throughput preclinical analgesic efficacy assessment.
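The paw luminance ratio described in the abstract can be illustrated with a minimal sketch. This assumes the ratio compares summed near-infrared FTIR contact luminance within the injured (ipsilateral) versus contralateral hind-paw regions of a frame; the function name, masks, and exact normalization are illustrative assumptions, and the paper's precise definition may differ:

```python
import numpy as np

def paw_luminance_ratio(frame, ipsi_mask, contra_mask):
    """Hypothetical paw luminance ratio from a single FTIR frame.

    frame: 2D array of near-infrared FTIR pixel intensities, where
    brighter pixels indicate firmer paw-surface contact.
    ipsi_mask / contra_mask: boolean masks selecting the injured and
    contralateral hind-paw contact regions.
    Returns summed ipsilateral luminance as a fraction of the total
    across both paws (0.5 would indicate symmetric weight bearing).
    """
    ipsi = float(frame[ipsi_mask].sum())
    contra = float(frame[contra_mask].sum())
    total = ipsi + contra
    return ipsi / total if total > 0 else float("nan")

# Toy example: the injured paw makes dimmer contact (guarding),
# so the ratio drops below 0.5.
frame = np.zeros((4, 4))
ipsi_mask = np.zeros((4, 4), dtype=bool)
contra_mask = np.zeros((4, 4), dtype=bool)
ipsi_mask[0, 0] = True
contra_mask[3, 3] = True
frame[0, 0] = 1.0   # dim contact: injured paw bears less weight
frame[3, 3] = 3.0   # bright contact: contralateral paw compensates
print(round(paw_luminance_ratio(frame, ipsi_mask, contra_mask), 2))  # 0.25
```

Tracking this scalar over long recordings would yield the kind of time-resolved, observer-independent hypersensitivity readout the abstract describes.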

