Crowd Evolution

Crowd Evolution: Empower and Reap


Overview

In the era of Big Data, crowdsourcing has emerged as a flexible solution that helps cope with the flood of data and the process of structuring it. By harnessing human intelligence from the crowd, we can steer many applications by seamlessly combining the best of both worlds: the unmatched computational power of state-of-the-art algorithms integrated with humans’ cognitive abilities and intellectual insights. Doing so ultimately realizes the concept of Human Computation.

Although crowdsourcing offers a cheap and highly flexible digital workforce, it opens the door to quality risks: unethical workers and free-riders can exploit its virtual and anonymous nature, as well as the short-term contracts on offer, for instance by simply submitting random answers. Such behavior compromises the core gain of crowdsourcing services and raises questions about the reliability and usability of the attained data. Accordingly, a rich body of research has been dedicated to ensuring high-quality results.

In the Crowd Evolution project we focus on two main directions:

1) Empowering the Crowd - we take quality assurance a step further by attempting to realize the vision of Impact Sourcing. We develop quality measures that not only single out spammers, but can also identify honest low-skilled workers, who would normally be excluded and misjudged as spammers (a toy illustration follows this list).

2) Reaping intelligent insights, instead of using the crowd only for mundane jobs. This includes, but is not limited to, eliciting rationales and relationships between attributes within the data.
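The following is a minimal, hypothetical sketch (in Python, not taken from the project's code) of the intuition behind such quality measures: on a set of gold questions, a random spammer hovers around chance-level accuracy, while an honest low-skilled worker stays consistently above it. The four answer options, the 55% skill level, and the normal-approximation margin are illustrative assumptions.

import math
import random

random.seed(0)
OPTIONS = ["A", "B", "C", "D"]     # assumed four-option gold questions
CHANCE = 1 / len(OPTIONS)          # chance-level accuracy

def accuracy(answers, gold):
    # Fraction of gold questions answered correctly.
    return sum(a == g for a, g in zip(answers, gold)) / len(gold)

def likely_spammer(acc, n, z=1.96):
    # Flag a worker whose accuracy is not clearly above chance level,
    # using a simple normal-approximation margin around the chance rate.
    # Consistent-but-weak workers stay above the margin; random answerers do not.
    margin = z * math.sqrt(CHANCE * (1 - CHANCE) / n)
    return acc <= CHANCE + margin

# Toy data: 40 gold questions.
gold = [random.choice(OPTIONS) for _ in range(40)]
# A spammer answers uniformly at random.
spammer = [random.choice(OPTIONS) for _ in gold]
# An honest low-skilled worker answers correctly roughly 55% of the time.
low_skilled = [g if random.random() < 0.55 else random.choice(OPTIONS) for g in gold]

for name, answers in [("spammer", spammer), ("low-skilled worker", low_skilled)]:
    acc = accuracy(answers, gold)
    print(f"{name}: accuracy={acc:.2f}, flagged={likely_spammer(acc, len(gold))}")

A plain accuracy threshold would cut off both workers; comparing against the chance level instead keeps the consistent low-skilled worker while still flagging the random answerer, which is the spirit of the Impact Sourcing direction described above.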

Posters

Contact 

This project is coordinated by Kinda El Maarry.

Publications

2016
Maarry, K. E., and W. - T. Balke, "Towards an Impact-driven Quality Control Model for Imbalanced Crowdsourcing Tasks", The 17th International Conference on Web Information Systems Engineering (WISE), Shanghai, China, 11/2016. Abstract  Download: 36 - Towards an impact driven quality control model.pdf (1.48 MB)
2015
Maarry, K. E., U. Güntzer, and W.-T. Balke, "A Majority of Wrongs Doesn't Make it Right - On Crowdsourcing Quality for Skewed Domain Tasks", 16th International Conference on Web Information Systems Engineering (WISE), Miami, Florida, 11/2015.
Maarry, K. E., U. Güntzer, and W.-T. Balke, "Realizing Impact Sourcing by Adaptive Gold Questions: A Socially Responsible Measure for Workers’ Trustworthiness", 16th International Conference on Web-Age Information Management (WAIM), Qingdao, Shandong, China, 06/2015.
Maarry, K. E., and W.-T. Balke, "Retaining Rough Diamonds: Towards a Fairer Elimination of Low-skilled Workers", 20th International Conference on Database Systems for Advanced Applications (DASFAA), Hanoi, Vietnam, 04/2015.
Maarry, K. E., C. Lofi, and W.-T. Balke, "Crowdsourcing for Query Processing on Web Data: A Case Study on the Skyline Operator", Journal of Computing and Information Technology (CIT), vol. 23, no. 1, 03/2015.
2014
Lofi, C., and K. E. Maarry, "Design Patterns for Hybrid Algorithmic-Crowdsourcing Workflows", 16th IEEE Conference on Business Informatics (CBI), Geneva, Switzerland, 07/2014.
Maarry, K. E., W.-T. Balke, H. Cho, S.-W. Hwang, and Y. Baba, "Skill Ontology-based Model for Quality Assurance in Crowdsourcing", Workshop on Uncertain and Crowdsourced Data (UnCrowd), DASFAA, Bali, Indonesia, 04/2014.