
This paper describes preliminary results of research related to programming teaching tools. This study focuses on the key issues highlighted in this research. Among the research questions of the study are: What are the important issues in programming teaching and learning research?

What are the methods of the research? What kinds of tools are involved in programming teaching and learning? What is the level of programming involved?

Noah Smith designs data-driven algorithms for the automated analysis of human language.



His book, Linguistic Structure Prediction, synthesizes many statistical modeling techniques for language. He is an amateur clarinetist, tanguero, swimmer, cocktail enthusiast, and serves on the staff of two felines. For more details, see his biographical blurb or academic CV.

Non-convex optimization is the core algorithmic workhorse of modern machine learning.


Its practical use far exceeds our theoretical understanding of it, yet rigor is key to achieving even better performance and algorithms. The tutorial has two parts: part one provides an overview of the exciting recent progress in rigorously understanding non-convex optimization, covering both first-order methods, such as gradient descent and its variants, and other approaches. Part two digs deeper into specific important problem classes and describes several directions for future research.
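As a concrete illustration of the first-order methods mentioned above, here is a minimal sketch of plain gradient descent on a simple non-convex function. The function, step size, and starting point are all made up for illustration and are not taken from the tutorial:

```python
# Gradient descent on the non-convex function f(x) = x^4 - 3x^2 + x,
# whose derivative is f'(x) = 4x^3 - 6x + 1.  Non-convexity means the
# method only finds a stationary point, and which one depends on where
# we start -- there is no global-optimality guarantee.

def grad(x):
    return 4 * x**3 - 6 * x + 1

x = 2.0      # starting point (illustrative)
lr = 0.01    # step size (illustrative)
for _ in range(1000):
    x -= lr * grad(x)

# x is now near a stationary point: grad(x) is essentially zero.
print(x, grad(x))
```

Starting instead from a negative initial point would drive the iterates toward a different stationary point, which is exactly the behavior that makes the non-convex setting hard to analyze.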

The tutorial should be broadly accessible to the typical ISIT audience; any more specific background will be developed as needed. Sujay's primary research interest is the development of rigorous new methods and algorithms for machine learning, and their use in solving core problems in industry. Sujay's background is in the fields of optimization, algorithms, stochastic processes, and graph theory. He has been a Visiting Scientist at Google Research in Mountain View, CA, and a quant and founding member of a portfolio management team at the statistical arbitrage hedge fund Engineers Gate.

Praneeth Netrapalli is a researcher at Microsoft Research India; prior to that, he was a postdoc at Microsoft Research New England. His main research interests are in developing provable and efficient algorithms for various problems in machine learning, using tools from optimization and probability.

One of the major recent advances in theoretical machine learning is the development of efficient learning algorithms for various high-dimensional statistical models.


The Achilles' heel of these algorithms is the assumption that the samples are generated precisely from the model. This assumption is crucial for their performance: even a very small fraction of outliers can completely compromise an algorithm's behavior. Recent results in theoretical computer science have led to the first computationally efficient robust estimators for a range of high-dimensional models.
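The fragility described above is easy to see even in one dimension. The following toy sketch (made up for illustration, not taken from the tutorial) shows the sample mean being destroyed by a single adversarial sample, while the median, a simple robust estimator, barely moves:

```python
import random
import statistics

# 999 clean Gaussian samples plus one adversarial outlier.
random.seed(0)
clean = [random.gauss(0.0, 1.0) for _ in range(999)]
corrupted = clean + [1e6]  # a single bad sample out of 1000

# The mean shifts by roughly outlier/n = 1000 -- completely compromised.
print(statistics.mean(clean), statistics.mean(corrupted))

# The median is essentially unchanged by one outlier.
print(statistics.median(clean), statistics.median(corrupted))
```

The challenge the tutorial addresses is that in high dimensions, simple coordinate-wise fixes like the median lose too much statistical accuracy, and achieving robustness efficiently requires genuinely new algorithmic ideas.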

The goal of this tutorial is to introduce the broad information theory community to the core insights and techniques in this area of algorithmic robust statistics, and discuss new directions and opportunities for future work.


Before moving to USC, he was a faculty member at the University of Edinburgh, and before that he was the Simons postdoctoral fellow in theoretical computer science at the University of California, Berkeley. His research focuses on the algorithmic and statistical foundations of massive data sets.

Private information retrieval (PIR) protocols make it possible to retrieve a data item from a database without disclosing any information about the identity of the item being retrieved.

In information-theoretic k-server PIR, the database is replicated among k non-communicating servers, and each server learns nothing about the item retrieved by the user. In the original formulation of the PIR problem, the files are assumed to be a single bit each, and the communication complexity is measured by the total amount of communication, i.e., the number of uploaded and downloaded bits.
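To make the model concrete, here is a minimal sketch of the classical 2-server scheme for single-bit records (an illustrative example; the tutorial itself does not specify this construction): the user sends a uniformly random index set to one server and the same set with the desired index flipped to the other, so each query on its own is a uniformly random subset and reveals nothing about the target index.

```python
import secrets

def answer(db, query):
    """Server side: XOR of the database bits selected by the query set."""
    acc = 0
    for j in query:
        acc ^= db[j]
    return acc

def retrieve(db, i, n):
    """User side: recover db[i] without either server learning i."""
    S = {j for j in range(n) if secrets.randbits(1)}  # uniformly random subset
    T = S ^ {i}               # symmetric difference: flip membership of i
    a1 = answer(db, S)        # query sent to server 1
    a2 = answer(db, T)        # query sent to server 2
    return a1 ^ a2            # XORs cancel on S, leaving db[i]

db = [1, 0, 1, 1, 0, 0, 1, 0]
assert all(retrieve(db, i, len(db)) == db[i] for i in range(len(db)))
```

Correctness follows because the two answers differ only in the contribution of index i; privacy holds because S and T are each uniformly distributed over all subsets, independent of i.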

However, a recent information-theoretic reformulation of the problem assumes the practical scenario in which the files are of arbitrarily large size. Under this setup, the number of uploaded bits is negligible relative to the number of downloaded bits. This reformulation defines the rate of a PIR scheme as the ratio between the size of the file and the total number of bits downloaded from all servers, and the supremum of achievable rates over all retrieval schemes as the PIR capacity.
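The capacity in this large-file setting has a closed form, due to Sun and Jafar: with k replicated non-colluding servers and N files, the PIR capacity is C = (1 - 1/k) / (1 - (1/k)^N). A small sketch evaluating it:

```python
def pir_capacity(k, N):
    """Sun-Jafar PIR capacity: k replicated non-colluding servers, N files.

    Equivalently (1 + 1/k + ... + 1/k^(N-1))^(-1).
    """
    return (1 - 1 / k) / (1 - (1 / k) ** N)

print(pir_capacity(2, 1))   # a single file: capacity 1, just download it
print(pir_capacity(2, 2))   # two servers, two files: capacity 2/3
print(pir_capacity(2, 100)) # many files: capacity approaches 1 - 1/k = 1/2
```

As the number of files grows, the capacity approaches 1 - 1/k, meaning the user must download roughly k/(k-1) times the file size, and this overhead is optimal.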



Coding and information theory have been extremely successful in characterizing the capacity of PIR and in proposing codes for capacity-achieving PIR schemes. The goal of this tutorial is to present to the audience significant results on the topic, from both coding and information theory, and to point to fresh problem formulations and open research directions brought about by this new paradigm.

Salim El Rouayheb, Rutgers University. His research interests lie in the area of information-theoretic security and privacy of data in networks and distributed systems. Sennur Ulukus, University of Maryland.



She received her Ph.D. Eitan Yaakobi, Technion. He received his Ph.D. under the supervision of Prof. Paul Siegel.