Top 10 big data and social science innovations

People look to academia as a source of innovation, especially in the natural and physical sciences. Researchers in the biosciences, clinical medicine, physics, and chemistry have long generated new ideas for industry to capitalize on. Innovations from the social sciences, by contrast, have generally been absorbed into the private sector via secondments or collaborative projects; the UK's Behavioural Insights Team, built on Richard Thaler's nudge theory, is a prominent example. The emergence of big data and computational social science, however, has generated a host of technologies that are either developed together with social science researchers or have clear applications in social science practice outside academia.

We partnered with IN-PART, an online matchmaking platform for university-industry collaboration, to identify the highest-performing innovations in big data and the social sciences. Alex Stockham and his colleagues at IN-PART selected the top 10 innovations based on article reads by R&D professionals since the start of 2017. These innovations represent a cross-section of the outputs from the 230+ universities and academic institutes around the world that use the platform to establish new partnerships with industry.

IN-PART was launched in 2014 to simplify the initial connection between teams in academia and industry and to start conversations that lead to the commercialization of academic research. It's a closed-loop system through which universities have their research proactively matched to R&D professionals from a global network of companies that are actively looking to collaborate with academia. If you're an industry researcher, or your university has a subscription to the platform, you can set up a free account: in-part.com/register.


AI-powered predictive analytics

Researchers at the University of Waterloo have taken predictive analytics to the next level with novel software that analyzes large sets of discrete data and models responses to external factors, improving forecasting accuracy. The technique improves upon existing tools, which often oversimplify problems or fail to adapt consistently over time. The software has been validated in water-demand forecasting but has the potential to be applied across a variety of sectors, including research, fintech, insurance, fraud detection, and e-commerce.
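
To make the idea concrete, here is a minimal sketch of forecasting with external factors, written in Python with scikit-learn. It is a generic illustration built on invented data and feature names, not the Waterloo software itself.

    # Hypothetical water-demand forecast conditioned on external factors.
    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor

    rng = np.random.default_rng(0)
    n = 365
    temperature = rng.normal(15, 8, n)    # daily mean temperature (C)
    rainfall = rng.exponential(2.0, n)    # daily rainfall (mm)
    day_of_week = np.arange(n) % 7        # 0 = Monday
    # Synthetic demand: warmer, drier days drive consumption up.
    demand = 100 + 1.5 * temperature - 2.0 * rainfall + rng.normal(0, 5, n)

    X = np.column_stack([temperature, rainfall, day_of_week])
    model = GradientBoostingRegressor().fit(X[:300], demand[:300])
    print("Held-out R^2:", model.score(X[300:], demand[300:]))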


Read the full article on IN-PART: https://app.in-part.com/articles/R8AgOz1eNo9L

A wearable sensor to monitor the health impacts of social interactions

The effects of social interactions on health are currently tracked through self-reporting or observational exercises. To enable more comprehensive assessments, researchers at Case Western Reserve University have created a multi-variable wearable socio-biosensor. The sensor not only monitors physiological data, such as temperature, respiratory rate, and pulse, but also collects social data, such as the tone and pitch of voice signals and the frequency and duration of social interactions. Together these provide more accurate monitoring and assessment, particularly for vulnerable groups such as military veterans, children, and the elderly.

Read the full article on IN-PART: https://app.in-part.com/articles/xkEY0XeV7bqL

Finding meaning in feedback

With apps, websites, and physical kiosks littering public spaces, we live within a constant feedback loop in which it's easier than ever for consumers to tell companies and other service providers what they think of their products and services. But gleaning meaningful information from that feedback is another task altogether, one that researchers at the University of Waterloo's Pattern Analysis and Machine Intelligence (PAMI) lab are seeking to address with a novel text analytics engine. The tool identifies important phrases in feedback text to derive meaning, with applications in sentiment analysis and enterprise content management.
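
As a rough illustration of the underlying task, the Python snippet below pulls salient phrases out of free-text feedback using TF-IDF weighting. The comments are invented, and the PAMI engine is of course far more sophisticated.

    from sklearn.feature_extraction.text import TfidfVectorizer

    feedback = [
        "The checkout process was slow and the app crashed twice.",
        "Great selection, but the checkout process needs work.",
        "Fast delivery and friendly support staff.",
    ]
    vec = TfidfVectorizer(ngram_range=(1, 2), stop_words="english")
    tfidf = vec.fit_transform(feedback)
    terms = vec.get_feature_names_out()
    for i, doc in enumerate(feedback):
        weights = tfidf[i].toarray().ravel()
        top = weights.argsort()[::-1][:3]  # three highest-weighted phrases
        print(doc, "->", [terms[j] for j in top])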

Read the full article on IN-PART: https://app.in-part.com/articles/eYdN1reGgzBG

Large-scale sentiment analysis

Although some methods of natural language sentiment analysis exist, they are not accurate enough to be relied on for objective data collection and analysis. To provide a more comprehensive solution, researchers at Stony Brook University have developed a new method that examines sentiment lexicons through the statistical analysis of text streams. By accurately analyzing the reputation of people, products, companies, and more, the technology could eliminate the need for extensive surveys and polls in market research, financial analysis, and internet search.
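
For readers unfamiliar with lexicon-based sentiment analysis, here is the basic idea in a few lines of Python. The lexicon below is a tiny stand-in; the Stony Brook method derives and refines such lexicons statistically from large text streams.

    # Toy sentiment lexicon: word -> polarity score.
    LEXICON = {"good": 1.0, "great": 1.5, "excellent": 2.0,
               "bad": -1.0, "awful": -1.5, "terrible": -2.0}

    def sentiment(text: str) -> float:
        """Average polarity of the lexicon words found in the text."""
        words = text.lower().split()
        scores = [LEXICON[w] for w in words if w in LEXICON]
        return sum(scores) / len(scores) if scores else 0.0

    print(sentiment("The new phone is great but the battery is terrible"))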

Read the full article on IN-PART: https://app.in-part.com/articles/AqPNZoYqgmW


Predicting native language from a reader’s gaze

Studies of cross-linguistic influence have traditionally examined a multilingual individual's writing to gain insight into language acquisition and use. With a view to advancing the field, researchers in the Computational Psycholinguistics Group at MIT's Department of Brain and Cognitive Sciences have developed a system that can predict a reader's native language from their gaze while they read free-form English. The technology derives linguistically motivated features from eye-movement patterns and applies a machine-learning algorithm to predict the reader's native language. With potential applications in forensics, advertising, and education, this work provides a novel framework for linguistic research.
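
The general pipeline can be sketched as follows: summarize each reader's eye-movement record into features, then train a classifier on those features. The feature set and data here are hypothetical, not MIT's actual ones.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)
    n = 200
    # Per-reader aggregates: mean fixation duration (ms), regression
    # (backward saccade) rate, and word-skipping rate -- all invented.
    X = np.column_stack([rng.normal(220, 30, n),
                         rng.uniform(0.05, 0.3, n),
                         rng.uniform(0.1, 0.5, n)])
    y = rng.integers(0, 2, n)  # toy labels: 0 = L1 Spanish, 1 = L1 Mandarin

    clf = LogisticRegression().fit(X[:150], y[:150])
    # With random data this hovers near chance; real gaze features carry signal.
    print("Toy accuracy:", clf.score(X[150:], y[150:]))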


Read the full article on IN-PART: https://app.in-part.com/articles/xjE7rkyqN3Ry


Filtering out noise on social media

Social media is now the biggest source of real-time information on ongoing, active situations. Despite this potential, current control-room technologies have yet to employ social media monitoring effectively. A new system developed by researchers in the School of Computer Science and Statistics at Trinity College Dublin uses machine learning and natural language processing to filter out noise and condense social media data into relevant information on developing situations, from natural disasters to planned major events. The technology can monitor any geographical region and assist with important real-time decision-making.
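
At its simplest, the filtering step is a text classifier that separates situation-relevant posts from noise, as in this Python sketch (the training examples are invented, and the Trinity pipeline is far richer):

    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.naive_bayes import MultinomialNB
    from sklearn.pipeline import make_pipeline

    posts = ["Flooding reported on Main Street, road closed",
             "Evacuation centre open at the town hall",
             "Check out my new playlist!",
             "50% off sneakers this weekend only"]
    labels = [1, 1, 0, 0]  # 1 = relevant to the incident, 0 = noise

    clf = make_pipeline(CountVectorizer(), MultinomialNB()).fit(posts, labels)
    print(clf.predict(["Bridge closed due to rising water",
                       "Best pizza in town"]))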


Read the full article on IN-PART: https://app.in-part.com/articles/9y5gxxz8gBLK

Finding causal relationships in complex networks

A vast wealth of data is continually generated in both research and commercial settings. As the scale of this data production grows, the tools required to analyze it and uncover causal connections must evolve to keep up. To this end, a new technology developed by Mount Sinai Health System researchers uses a unique algorithmic approach to identify causal relationships within multivariate datasets. It has already been used to identify key relationships in both biological research and financial settings, highlighting its broad applicability for analyzing causation in large-scale data.
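
One standard way to probe directional relationships in time-series data, shown here purely for illustration, is a Granger-causality test; the Mount Sinai algorithm takes its own, distinct approach.

    import numpy as np
    from statsmodels.tsa.stattools import grangercausalitytests

    rng = np.random.default_rng(2)
    n = 500
    x = rng.normal(size=n)
    y = np.roll(x, 1) + 0.5 * rng.normal(size=n)  # y follows x by one step

    # Test whether the second column (x) helps predict the first (y).
    res = grangercausalitytests(np.column_stack([y, x]), maxlag=2)
    print("p-value at lag 1:", res[1][0]["ssr_ftest"][1])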

Read the full article on IN-PART: https://app.in-part.com/articles/xjE7rkejN3Ry

Bridging the semantic gap for visual search

Searching for visual content currently relies heavily on text queries and annotated metadata. However, due to a 'semantic gap' between what an image actually contains and its descriptors, results do not always match the user's search intention. Now, researchers at the University of Kansas have developed an improved search method that analyzes both text and content-based image features to deliver query results that are more accurate and more consistent with the user's perception.
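
A toy version of the blending step might look like this in Python: score each candidate image on text match and on content-based similarity (here, colour histograms), then rank by the combination. The features and weights are made up for the example.

    import numpy as np

    def cosine(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

    text_scores = np.array([0.9, 0.2, 0.6])  # query-vs-caption match per image
    query_hist = np.array([0.5, 0.3, 0.2])   # query image colour histogram
    image_hists = np.array([[0.1, 0.4, 0.5],
                            [0.5, 0.3, 0.2],
                            [0.4, 0.4, 0.2]])

    content_scores = np.array([cosine(query_hist, h) for h in image_hists])
    blended = 0.5 * text_scores + 0.5 * content_scores
    print("Ranking (best first):", np.argsort(blended)[::-1])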

Read the full article on IN-PART: https://app.in-part.com/articles/rOzNY55wNMBQ

Interactive visual insights from social media

On Twitter alone, over 6,000 tweets are posted every second. Researchers in the College of Computing and Informatics at the University of North Carolina at Charlotte have created a near-real-time, visual, interactive tool for monitoring and responding to topical trends and patterns that would otherwise be hidden in the massive amounts of data generated by social media platforms. Applications for the tool are wide-ranging, from assisting public health officials during disease outbreaks to helping advertisers evaluate their marketing efforts.
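
One simple building block for this kind of monitoring is burst detection: flag a term when its frequency in the current window jumps well above its recent average. A minimal Python sketch, with invented messages:

    from collections import Counter, deque

    history = deque(maxlen=5)  # term counts for the last five windows

    def spikes(window_msgs, factor=3.0):
        counts = Counter(w for m in window_msgs for w in m.lower().split())
        flagged = []
        for term, c in counts.items():
            avg = sum(h.get(term, 0) for h in history) / max(len(history), 1)
            if c >= factor * max(avg, 1):
                flagged.append(term)
        history.append(counts)
        return flagged

    print(spikes(["outbreak reported downtown",
                  "second outbreak case confirmed",
                  "outbreak map updated"]))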

Read the full article on IN-PART: https://app.in-part.com/articles/x83gvlmx7X2y

Self-learning machine learning

To improve over time, machine learning and data mining systems need to be able to analyze previously applied methods and suggest new, improved algorithms. Delphi, a distributed, multi-model, self-learning platform for machine learning developed by researchers at MIT's Computer Science and Artificial Intelligence Laboratory, aims to address this. The system optimizes existing analysis methods in significantly less time than a human team could, breaking big datasets down into smaller chunks to create an improved, highly efficient meta-model. The technology can be implemented in any environment that makes use of predictive analytics and machine learning.
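
The core loop behind such platforms can be reduced to: try many model and parameter combinations, score each, and keep the best. In this sketch a plain scikit-learn grid search stands in for Delphi's distributed, self-learning search.

    from sklearn.datasets import load_iris
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import GridSearchCV

    X, y = load_iris(return_X_y=True)
    search = GridSearchCV(RandomForestClassifier(random_state=0),
                          {"n_estimators": [10, 50], "max_depth": [2, None]},
                          cv=3)
    search.fit(X, y)
    print(search.best_params_, round(search.best_score_, 3))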

Read the full article on IN-PART: https://app.in-part.com/articles/YzAM7VrW7n3B



Technology summaries written by Emma Brown (1, 2, 3 & 9), Eve Satkevic (4, 8 & 10) and Joe Ferner (5, 6 & 7). Introduction and editing by Alex Stockham.


About IN-PART:

Launched in 2014, we now work with over 230 universities and research institutes worldwide, strategically matching promising academic research with relevant R&D professionals from a network of 5,500+ innovation-driven companies.

Over 6,000 new conversations have been started through the platform, 75% of them between international partners, resulting in everything from grant funding for collaborative research and co-development projects to the testing of new materials and proprietary compounds, product development, licensing deals, and long-term strategic partnerships.
