<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="utf-8" />
<meta name="description" content="CSDM 2011 is a workshop at WSDM 2011 examining crowdsourcing for search and data mining." />
<meta name="author" content="CSDM 2011" />
<meta name="keywords" content="csdm, wsdm, crowdsourcing, search, data mining, bing" />
<link rel="stylesheet" href="csdm.css" type="text/css" media="screen" />
<link rel="icon" href="" type="image/x-icon" />
<title>CSDM 2011 | Crowdsourcing for Search and Data Mining | a WSDM 2011 workshop</title>
</head>
<body>
<div id="container">
<div id="header">
<a href=""><h1>WSDM 2011 - Hong Kong, China</h1></a>
<img src="title.jpg" width="600" height="299" alt="CSDM 2011 - Crowdsourcing for Search and Data Mining" />
</div>
<ul id="nav">
<li><a href="index.html">Overview</a></li>
<li><a href="people.html">People</a></li>
<li><a href="sponsors.html">Sponsors</a></li>
<li><a href="papers.html">Call for Papers</a></li>
<li><a href="program.html">Program</a></li>
<li><a href="proceedings.html">Proceedings</a></li>
</ul>
<div id="main">
<div id="sidebar" class="y">
<h2>Email list</h2>
<p>Subscribe to the Google group for workshop announcements and discussion.</p>
<form action="">
Email address: <input type="text" name="email">
<input type="submit" name="sub" value="Subscribe">
</form>
<p><a href="">Visit the CSDM 2011 Google group website</a></p>
</div>
<p><b>Wednesday, February 9, 8:30am-5pm, Hong Kong</b> (see <a href="">overview</a>)</p>
<p>The advent of <a href="">crowdsourcing</a> is revolutionizing data annotation, evaluation, and other traditionally labor-intensive processes by dramatically reducing the time, cost, and effort involved. This in turn is driving a disruptive shift in search and data mining methodology in areas such as:</p>
<ul>
<li><p><strong>Evaluation</strong>: the Cranfield paradigm for search evaluation requires manually assessing the relevance of documents to search queries. Recent work on stochastic evaluation has reduced, but not removed, this need for manual assessment.</p></li>
<li><p><strong>Supervised Learning</strong>: while the traditional costs of data annotation have driven recent machine learning work (e.g., Learning to Rank) toward greater use of unsupervised and semi-supervised methods, the emergence of crowdsourcing has made labeled data far easier to acquire, potentially driving a resurgence in fully supervised methods.</p></li>
<li><p><strong>Applications</strong>: crowdsourcing has introduced exciting new opportunities to integrate human labor into automated systems: handling difficult cases where automation fails, and exploiting workers' breadth of backgrounds, geographic dispersion, and real-time responsiveness.</p></li>
</ul>
<p>CSDM 2011 builds on the recent success of the <a href="">Crowdsourcing for Search Evaluation Workshop (CSE 2010)</a> at <a href="">SIGIR 2010</a>, extending beyond search evaluation to explore more broadly the development and application of crowdsourcing techniques for search and data mining.</p>
<h2>Audience and Scope</h2>
<p>This workshop will bring together researchers and practitioners of crowdsourcing techniques.
It is intended to serve as a catalyst for future collaborations, as well as
one of the main forums for sharing the latest developments in this area. The workshop will
give participants an opportunity to hear about and discuss key issues such as:</p>
<ul>
<li>Advantages and disadvantages of crowdsourcing vs. traditional methods</li>
<li>When and how to use crowdsourcing for an experiment</li>
<li>How to increase the quality and throughput of crowdsourcing</li>
<li>How to detect cheating and handle noise in crowdsourcing</li>
<li>General guidelines and best practices for crowdsourcing experiments</li>
<li>The latest improvements and current state-of-the-art crowdsourcing systems and methods</li>
<li>The reach and potential of recent innovative applications in the area</li>
</ul>
</div>
<div id="footer">
<a href="">Contact webmaster</a> | © CSDM 2011 | Photo by <a href="">Hamedog</a>, used under Creative Commons</div>
</div>
</body>
</html>