<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="utf-8" />
<meta name="description" content="CSDM 2011 is a workshop at WSDM 2011 examining crowdsourcing for search and data mining." />
<meta name="author" content="CSDM 2011" />
<meta name="keywords" content="csdm, wsdm, crowdsourcing, search, data mining, bing" />
<link rel="stylesheet" href="csdm.css" type="text/css" media="screen" />
<link rel="icon" href="http://www.wsdm2011.org/wsdm2011/lib/tpl/hongkong/images/icon.png" type="image/png" />
<title>CSDM 2011 | Crowdsourcing for Search and Data Mining | a WSDM 2011 workshop</title>

</head>

<body>
<div id="container">
<div id="header">
<a href="http://www.wsdm2011.org/"><h1>WSDM 2011 - Hong Kong, China</h1></a>
<img src="title.jpg" width="600" height="299" alt="CSDM 2011 - Crowdsourcing for Search and Data Mining" />
</div>
<ul id="nav">
<li><a href="index.html">Overview</a></li>
<li><a href="people.html">People</a></li>
<li><a href="sponsors.html">Sponsors</a></li>
<li><a href="papers.html">Call for Papers</a></li>
<li><a href="program.html">Program</a></li>
<li><a href="proceedings.html">Proceedings</a></li>
</ul>
<div id="main">
<div id="sidebar" class="y">
<h2>Email list</h2>
<p>Subscribe to the Google group for workshop announcements and discussion.</p>
<form action="http://groups.google.com/group/csdm-2011/boxsubscribe">
Email address: <input type="text" name="email" />
<input type="submit" name="sub" value="Subscribe" />
</form>
<p><a href="http://groups.google.com/group/csdm-2011">Visit the CSDM 2011 Google group website</a></p>
</div>
<p><b>Wednesday February 9, 8:30am-5pm, Hong Kong</b> (see the <a href="http://www.wsdm2011.org/wsdm2011/overview">overview</a>)</p>
<p>The advent of <a href="http://en.wikipedia.org/wiki/Crowdsourcing">crowdsourcing</a> is revolutionizing data annotation, evaluation, and other traditionally labor-intensive manual processes by dramatically reducing the time, cost, and effort involved. This shift is in turn disrupting search and data mining methodology in areas such as:</p>
<ul>
<li><p><strong>Evaluation</strong>: the Cranfield paradigm for search evaluation requires manually assessing document relevance to search queries. Recent work on stochastic evaluation has reduced, but not removed, this need for manual assessment.</p></li>
<li><p><strong>Supervised Learning</strong>: while the traditional costs of data annotation have driven recent machine learning work (e.g. Learning to Rank) toward greater use of unsupervised and semi-supervised methods, the emergence of crowdsourcing has made labeled data far easier to acquire, potentially driving a resurgence in fully supervised methods.</p></li>
<li><p><strong>Applications</strong>: crowdsourcing has introduced exciting new opportunities to integrate human labor into automated systems: handling difficult cases where automation fails, and exploiting workers' breadth of backgrounds, geographic dispersion, real-time responses, etc.</p></li>
</ul>
<p>CSDM 2011 will build on the recent success of the <a href="http://ir.ischool.utexas.edu/cse2010">Crowdsourcing for Search Evaluation Workshop (CSE 2010)</a> at <a href="http://www.sigir2010.org">SIGIR 2010</a>, extending beyond search evaluation to broadly explore the development and application of crowdsourcing techniques for search and data mining.</p>
<h2>Audience and Scope</h2>
<p>This workshop will bring together researchers and practitioners of crowdsourcing techniques. It is intended to serve as a catalyst for future collaborations and as one of the main forums for sharing the latest developments in the area. The workshop will give participants an opportunity to hear about and discuss key issues such as:</p>
<ul>
<li>Advantages and disadvantages of crowdsourcing vs. traditional methods</li>
<li>When and how to use crowdsourcing for an experiment</li>
<li>How to increase the quality and throughput of crowdsourcing</li>
<li>How to detect cheating and handle noise in crowdsourcing</li>
<li>General guidelines and best practices for crowdsourcing experiments</li>
<li>The latest improvements and the current state of the art in crowdsourcing systems and methods</li>
<li>The reach and potential of recent innovative applications in the area</li>
</ul>
</div>
<div id="footer">
<a href="mailto:nrejack@gmail.com">Contact webmaster</a> | © CSDM 2011 | Photo by <a href="http://www.flickr.com/photos/hamedog/">Hamedog</a>, used under a Creative Commons license</div>
</div>
</body>
</html>