You are currently browsing the monthly archive for January 2011.

Since I started my blog, I have noticed several important events in the academic world.

**1. The Erdős Distance Problem: a “breakthrough” in combinatorial geometry**

**Erdos Distance Problem SOLVED!**

**János Pach: Guth and Katz’s Solution of Erdős’s Distinct Distances Problem**

**The Guth-Katz bound on the Erdős distance problem**

### Guest Post: Distinct distances, and the use of continuous mathematics in discrete geometry.

**2. The Hirsch Conjecture**

**Francisco Santos Disproves the Hirsch Conjecture**

### Hirsch Conjecture disproved

**Efficiency of the Simplex Method: Quo vadis Hirsch conjecture?**

**3. Khovanov homology: Witten’s new preprint “Fivebranes and Knots”**

### Newsflash: Witten’s new preprint

**4. Partition numbers**

### Finite formula found for partition numbers

New math theories reveal the nature of numbers

### A surprise dimension to adding and counting

Are there any important events I have missed? Please let me know.

These days I have scanned lots of websites and resources, which you can see from my blog pages. But although I want to learn so many things, such as algebra and geometry, I do not have enough time and energy to handle them all. Since I have another problem, my poor English, and the qualifying exam is also waiting for me, I have to focus on my four courses: Probability Theory, Theoretical Statistics, Generalized Linear Models, and Statistical Machine Learning. I have to regard the first two as my essential foundations and pay extra attention to them, since they are the core content of the qualifying examination. For now I just want to focus my research on machine learning, so it is time for me to start down my research road. Moreover, I also have to improve my English, which is another big issue for me. So from now on, I will put other things aside. Focus is the first step to depth!!!

Since I have posted “Geometry and Machine Learning” and “Algebra and Machine Learning”, I now want to continue this series as completely as possible. I want to say something about statistics and machine learning again. The most insightful pieces I want to recommend are listed below:

Brendan O’Connor’s **Statistics vs. Machine Learning, fight!**

Adam Klivans’s **Thoughts regarding “Is machine learning different from statistics?”**

Tom M. Mitchell’s **The Discipline of Machine Learning**

If you have any comments or recommendation, please write them down here.

Algebraic statistics advocates the use of algebraic geometry, commutative algebra, and geometric combinatorics as tools for making statistical inferences. The starting point for this connection is the observation that most statistical models for discrete random variables are, in fact, algebraic varieties. While some of the varieties that appear are classical varieties (like Segre varieties and toric varieties), most are new, and there are many challenging open problems about the algebraic structure of these varieties.
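To make the variety claim concrete, here is a minimal numerical sketch (my own illustration in Python/NumPy, not part of the program below): the independence model for two binary random variables consists exactly of the 2×2 joint probability tables of rank 1, i.e. the tables on which the Segre determinant p00·p11 − p01·p10 vanishes.

```python
import numpy as np

# Independence model for two binary random variables:
# the joint table P = [[p00, p01], [p10, p11]] satisfies
# independence iff rank(P) == 1, i.e. the determinant
# p00*p11 - p01*p10 vanishes (the Segre variety of rank-1 matrices).

px = np.array([0.3, 0.7])   # marginal distribution of X
py = np.array([0.6, 0.4])   # marginal distribution of Y
P = np.outer(px, py)        # independent joint = outer product of marginals

det = P[0, 0] * P[1, 1] - P[0, 1] * P[1, 0]
print(det)  # 0.0 -- the defining polynomial vanishes on the model

# A generic (dependent) joint table does not lie on the variety:
Q = np.array([[0.25, 0.25],
              [0.35, 0.15]])
print(Q[0, 0] * Q[1, 1] - Q[0, 1] * Q[1, 0])  # -0.05, nonzero
```

The polynomial p00·p11 − p01·p10 is the simplest example of the “algebraic structure” the paragraph above refers to; larger discrete models are cut out by systems of such polynomials.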

Now there is a great program being held at the Mittag-Leffler Institute:

**Algebraic Geometry with a view towards applications**

which includes a Ph.D. course, “Algebraic Geometry, computations and applications”, co-taught by Alicia Dickenstein from Buenos Aires and Bernd Sturmfels from UC Berkeley.

There are several famous professors in this field:

Alicia Dickenstein from Buenos Aires

Bernd Sturmfels from UC Berkeley

Seth Sullivant from North Carolina State University

Risi Kondor from the Center for the Mathematics of Information, Caltech

Sumio Watanabe from the Tokyo Institute of Technology

Caroline Uhler from Berkeley

Here is a short course on Algebraic statistics by Professor Seth Sullivant:

Algebraic Statistics Short Course

Here is the video lecture by Risi Kondor:

**Group Theory and Machine Learning**

And you can find his thesis on this topic on his homepage.

### Manifold learning and Geometry

### SoCG 2007: Geometric Views of Learning

The video of the talk **A Geometric Perspective on Machine Learning**, given by Professor **Partha Niyogi (1967-2010)**, is here. There is another video lecture given by Partha Niyogi about Geometric Methods and Manifold Learning here.

The subject was manifold learning:

Given a collection of (unlabelled) data inhabiting some high-dimensional space, can you determine whether they actually lie on some lower-dimensional manifold in this space?

This talk was a great window into an area of machine learning with strong geometric connections, an area where geometers could profitably play and make useful contributions.
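As a toy illustration of the question (my own NumPy sketch, not a method from the talk): embed a circle, which is intrinsically 1-dimensional, into R^10, and watch how the eigenvalue decay of PCA, computed globally and on a small neighborhood, exposes the low-dimensional structure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sample 500 points on a unit circle (intrinsic dimension 1), then
# embed the 2-D ambient coordinates into R^10 by a random linear
# isometry, so each data point has 10 coordinates.
t = rng.uniform(0, 2 * np.pi, 500)
circle = np.column_stack([np.cos(t), np.sin(t)])     # shape (500, 2)
basis, _ = np.linalg.qr(rng.normal(size=(10, 2)))    # orthonormal 10x2
X = circle @ basis.T                                 # shape (500, 10)

# Global PCA: the covariance eigenvalues show the data spans only a
# 2-D linear subspace of R^10 (the plane containing the circle).
Xc = X - X.mean(axis=0)
eigvals = np.linalg.eigvalsh(Xc.T @ Xc / len(X))[::-1]  # descending
n_directions = int((eigvals > 1e-10).sum())
print(n_directions)  # 2: the circle sits in a 2-D plane

# Local PCA near one point: on a small neighborhood the circle looks
# like a line, revealing the true intrinsic dimension 1.
p = X[0]
nbrs = X[np.argsort(np.linalg.norm(X - p, axis=1))[:20]]
nbrs_c = nbrs - nbrs.mean(axis=0)
local = np.linalg.eigvalsh(nbrs_c.T @ nbrs_c)[::-1]
intrinsic_dim = int((local > 0.05 * local[0]).sum())
print(intrinsic_dim)  # 1: locally the data is one-dimensional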

Two theses:

Group theoretical methods in machine learning

Riemannian Geometry and Statistical Machine Learning

Now it is already past 00:00, so it is Monday. The spring semester of 2011 has begun, and it will be a busy one. I will take four courses: probability theory, theoretical statistics, applied statistical methods (generalized linear models), and machine learning. I should change my attitude toward taking classes. In the past, I just took classes as duties. Now I think that was wrong. I should take them as material that I myself want to learn, as the foundations of my academic life and the research to follow. I should regard the instructors as secondary and myself as the cardinal factor. Fighting!

Things You Never Say to a Graduate Student

Books Every Graduate Student Should Read

Twitter, LinkedIn, and Facebook StackOverflow user lists sorted by reputation

how I learned to stop worrying and love LaTeXing in Real Time (some useful tips on live TeXing)

10 easy ways to fail a Ph.D.

The 5+5 Commandments of a Ph.D.

6 tips for low-cost academic blogging

The illustrated guide to a Ph.D.

3 qualities of successful Ph.D. students

The following pic is from Anton’s Home Page:
