Wednesday, 19 June 2013

Big Data for AI : workshop at SGAI 2013

Co-Chair : Dr. Simon Thompson (BT Research)
Co-Chair : Dr. Dean Jones (Numero)

Invited Keynote : TBA

Big Data technology such as Hadoop provides a number of novel components for handling and processing large volumes of complex, poly-structured data: for example Hive, a massively parallel SQL data warehouse; HBase, a distributed NoSQL store; and the Map-Reduce and YARN frameworks for distributed programming. Hadoop makes the vast processing and storage power created by the march of Moore's law easy and cheap to access.
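The Map-Reduce programming model mentioned above is usually introduced through the word-count example. The following is a single-process Python sketch of the map, shuffle and reduce phases, intended only to illustrate the model; real Hadoop jobs would implement the same phases against the framework's Java API:

```python
from collections import defaultdict

def map_phase(documents):
    """Map step: emit a (word, 1) pair for every word in every document."""
    for doc in documents:
        for word in doc.split():
            yield (word, 1)

def shuffle(pairs):
    """Shuffle step: group emitted values by key, as the framework
    does between the map and reduce phases."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    """Reduce step: sum the counts for each word."""
    return {word: sum(counts) for word, counts in grouped.items()}

docs = ["big data for ai", "ai for big data"]
counts = reduce_phase(shuffle(map_phase(docs)))
print(counts)  # {'big': 2, 'data': 2, 'for': 2, 'ai': 2}
```

Because the map and reduce functions are side-effect free, the framework can partition the input across many machines and run them in parallel, which is what makes the model attractive for the large training sets common in AI work.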

These components have proven useful to researchers working in AI, who have developed frameworks on top of this technology, in particular for Machine Learning (http://mahout.apache.org/) and Genetic Programming (http://groups.csail.mit.edu/EVO-DesignOpt/groupWebSite/index.php?n=Site.FlexGP).

However, AI has traditionally been plagued by computational problems in areas such as planning, vision and speech transcription; are there opportunities to apply Hadoop and associated technologies in these domains?

This workshop has three objectives :

- to provide a place where AI researchers in the UK can gain an understanding of the new opportunities that this technology offers;
- to provide an opportunity for new methods of using Hadoop or other Big Data systems to be described and discussed;
- to explore new requirements and innovations that would make Hadoop and other Big Data tools more effective for AI.

Contributions that address any of these three goals are sought. Please email simon.2.thompson@bt.com with proposals for sessions, presentations, posters or discussions as soon as possible.

Submission deadline : October 7th 2013.
