
Radev’s Coursera Introduction to Natural Language Processing – A Review

As I promised earlier, here is my review of Prof. Dragomir Radev’s introductory class on natural language processing. A few words about Prof. Radev: according to his Wikipedia entry, he is an award-winning professor who co-founded the North American Computational Linguistics Olympiad (NACLO), the computational-linguistics equivalent of the USAMO. He also coached the U.S. team at the 2011 International Linguistics Olympiad and helped the team win several medals [1]. I think these are great contributions to the speech and language community. In the late 90s, when I was still an undergraduate, computational language processing received little recognition as an important computational skill. With competitions at the high-school and college level, there will be a generation of young minds who aspire to build intelligent conversational agents, robust speech recognizers and versatile question-answering machines. (Or else everyone would think the Linux kernel is the only cool hack in town. 🙂 )

The Class

So how about the class?  I have to say I am more than surprised and happy with it.   I was searching for an intro NLP class, and the two natural choices were Prof. Jurafsky and Manning’s class and Prof. Collins’s Natural Language Processing.   Both received great praise, and a few of my friends recommended taking both.   Unfortunately, neither was being offered at the time, so I could only watch the material offline.

Then came Prof. Radev’s class.  It is, as Prof. Radev explains, “more introductory” than Collins’s class and “more focused on linguistics and resources” than Jurafsky and Manning’s [2].   So it is good for two types of learners:

  1. Those who just started out in NLP.
  2. Those who want to gather useful resources and start projects on NLP.

I belong to both types.   My job requires me to have a more comprehensive knowledge of language and speech processing.

The Syllabus and The Lectures

The class itself is a brief survey of many important topics in NLP.   There are the basics: parsing, tagging, language modeling.  There are also advanced topics such as summarization, statistical machine translation (SMT), semantic analysis and dialogue modeling.   The lectures, apart from occasional mistakes, are quite well done and filled with interesting examples.

My only criticism is perhaps the length of the videos; I would prefer most videos to be less than 10 minutes long.    That makes it easier to rotate them with my other daily tasks.

The material is not too difficult for newcomers to absorb.   For starters, advanced topics such as SMT are not covered in much mathematical detail.  (So there is no need to derive EM for the IBM models.)  I think that is quite appropriate for first-time learners like me.

One more unique feature of the lectures: they are filled with interesting NACLO problems.    While NACLO is a high-school-level competition, most of the problems are challenging even for experienced practitioners, and I found them quite stimulating.

The Prerequisites and The Homework Assignments

To me, the fun part is the homework.   There were three assignments, focusing on:

  1. Nivre’s Dependency Parser,
  2. Language Modeling and POS Tagging,
  3. Word Sense Disambiguation

All homework assignments are based on Python.   If you know what you are doing, they are not that difficult; I spent around 12-14 hours on each, usually on weekends.   Just like Ng’s Machine Learning class, you need to match your numbers against a golden reference.   I think that is the right approach when learning any machine learning task for the first time: blindly coming up with a system and hoping it works never gets you anywhere.
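To give a flavor of the assignments, here is a toy bigram language model with add-one smoothing, in the spirit of the second homework.  This is my own sketch, not the actual assignment code:

from collections import defaultdict

def train_bigram(sentences):
    # count unigrams and bigrams, with sentence-boundary markers
    unigram = defaultdict(int)
    bigram = defaultdict(int)
    for sent in sentences:
        tokens = ['<s>'] + sent + ['</s>']
        for w in tokens:
            unigram[w] += 1
        for w1, w2 in zip(tokens, tokens[1:]):
            bigram[(w1, w2)] += 1
    return unigram, bigram

def bigram_prob(unigram, bigram, w1, w2):
    # add-one (Laplace) smoothed P(w2 | w1)
    vocab_size = len(unigram)
    return (bigram[(w1, w2)] + 1.0) / (unigram[w1] + vocab_size)

unigram, bigram = train_bigram([['the', 'cat', 'sat'], ['the', 'dog', 'sat']])
print(bigram_prob(unigram, bigram, 'the', 'cat'))  # 0.25

The real assignments are of course larger, but matching a simple number like that 0.25 against a golden reference is exactly the kind of check the graders used.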

The homework does point to one issue with the class: you need to know the basics of machine learning.  Also, anyone without programming experience will find the homework very difficult.   That probably describes many linguistics students who never took a computer science class.  [3]    You can still “power through” and pass, but it can be unnecessarily hard.

So I recommend first taking Ng’s class, or perhaps the latest Machine Learning specialization from Guestrin and Fox.   Those classes give you the basics of programming as well as the basic concepts of machine learning.

If you didn’t take any machine learning class, one way to get through a more difficult class like this is to read the forum messages.   There were many nice people in the course answering all kinds of questions.   To be frank, if the forum didn’t exist, it would have taken me around three times longer to finish all the assignments.

Final Word

All in all, I highly recommend Prof. Radev’s class to anyone who is interested in NLP.    As I mentioned, though, the class does have prerequisites, namely the basics of programming and machine learning.   So I would recommend that learners take Ng’s class before taking this one.

In any case, I want to thank Prof. Radev and all the teaching staff who prepared this wonderful course.   I also thank the many classmates who helped me through the homework.

Arthur

Postscript, April 2017

After I wrote this review, Coursera upgraded to its new platform format.  It’s a pity that none of the NLP classes, including Prof. Radev’s, survived.   Too bad for NLP lovers!

There has also been a seismic shift in the field of NLP toward deep learning. While deep learning does not dominate evaluations the way it does in computer vision or speech recognition, it is perhaps the most actively researched direction right now.  So if you are curious about what’s new, consider taking the latest Stanford cs224n (2017) or Oxford’s Deep Learning for NLP.

[1] http://www.eecs.umich.edu/eecs/about/articles/2010/Radev-Linguistics.html

[2] Week 1 Lecture 1 Introduction

[3] One anecdote:  in the forum, a student was asking why you can’t just sum all the data points of a class together and pour them into scikit-learn’s fit().    I don’t blame the student, because she started late and lacked the prerequisites.   She later finished all the assignments, and I really admire her determination.


Apology, Updates and Misc.

There were some questions on LinkedIn about the whereabouts of this blog.   As you may have noticed, I haven’t posted any updates for a while.   I was crazy busy with work at Voci (good!) and many life challenges, just like everyone else.    I am having a lot of fun with programming, though, as I am working with two of my favorite languages, C and Python.  Life is not bad at all.

My apologies to all readers; it can be tough to blog sometimes.  Hopefully, this situation will change later this year…..

A couple of worthwhile news items in ASR:  Goldman Sachs won the trial in the Dragon lawsuit, and there is also VB’s piece on Microsoft doubling the speed of their recognizer.

I don’t know what to make of the lawsuit, but I feel a bit sad.  Dragon has been home to many elite speech programmers, developers and researchers.  Many old-timers of speech were there, and most of them sigh about the whole L&H fiasco.   If I were them, I would feel the same.   In fact, once you know a bit of ASR history, you notice that the fall of L&H gave rise to one you-know-its-name player of today.  So in a way, the fates of two generations of ASR people were altered.

As for the MS piece, it follows another trend these days: the emergence of DBNs.  Is it surprising?  Probably not; it’s rather easy to speed up neural network computation.  (Training is harder, but that is where DBNs are strong compared to previous NN approaches.)

On Sphinx, I will point out one recent bug report contributed by Ricky Chan, which exposed a problem in bw’s MMIE training.   I have yet to try it, but I believe Nick has already incorporated the fix into the open-source codebase.

Another item Nick has been stressing lately is using Python, instead of Perl, as the scripting language of SphinxTrain.   I think that’s a good trend.  I like Perl and use one-liners and map/grep-style programs a lot.  Generally, though, it’s hard to find a concrete coding standard for Perl, whereas Python seems cleaner and leads naturally to OOP.  This is an important issue: Perl programmers and Perl programming styles seem to be spawned from many different types of languages.   The original (bad) C programmer will fondly use globals and write functions with 10 arguments.  The original C++ programmer might expect language support for OOP but find that “it is just a hash”.   These style differences can make Perl training scripts hard to maintain.
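As a small illustration of that cleanliness, here is a rough Python analogue of the Perl map/grep one-liner style; the data is made up for the example:

words = ['sphinx', 'hmm', 'wfst', 'dbn']
# filter and transform in one pass, the Python counterpart of grep + map
short_upper = [w.upper() for w in words if len(w) <= 4]
print(short_upper)  # ['HMM', 'WFST', 'DBN']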

That’s why I like Python more.  Even a very badly written script seems to convert itself into something more maintainable.   There is also a good pathway for connecting Python and C.  (Cython is probably the best.)
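Cython deserves a post of its own, but as a quick taste of the Python/C pathway in general, here is a minimal sketch using ctypes from the standard library to call into the C math library; the library lookup assumes a Unix-like system:

import ctypes
import ctypes.util

# load libm and declare the C signature of sqrt()
libm = ctypes.CDLL(ctypes.util.find_library('m'))
libm.sqrt.restype = ctypes.c_double
libm.sqrt.argtypes = [ctypes.c_double]

print(libm.sqrt(2.0))  # 1.4142135623730951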

In any case, that’s all I have this time.  I owe all of you many articles; let’s see if I can write some in the near future.

Arthur


GJB Wednesday Speech-related Links/Commentaries (DragonTV, Siri vs Xiao i Robot, Coding with Voice)

Apple Appears In Court In China To Defend Against Siri Patent Infringement Claim (Techcrunch)
The ZhiZhen company (智臻網絡科技) from Shanghai is suing Apple for infringing its patents.  (Here is the original Shanghai Daily article.)  According to the news, back in 2006 ZhiZhen had already developed the engine for Xiao i Robot (小i機械人).  A video from 8 months ago is below. 
Technically, it is quite possible that a Siri-like system could have been built in 2006.  (Take a look at Olympus/RavenClaw.)  Of course, the Siri-like interface you see here was certainly built in the smartphone era (which, by my definition, began when the iPhone was released).   So overall, it’s a bit hard to say who is right.  
Of course, when interpreting news from China, it is tempting to apply slightly different logic. In the TC article, the OP (Etherington) suggested that the whole lawsuit could be state-orchestrated and related to Beijing’s recent attacks on Apple. 
I don’t really buy the OP’s argument; Apple is constantly being sued in China (and all over the world).  It is hard to link the two events together.  
This is definitely not the Siri for TV.

Oh well, Siri is not just speech recognition; there is also smart interpretation at the sentence level: scheduling, making appointments, doing the right search.   Those are challenges in themselves.    In fact, I believe Nuance only provides the ASR engine for Apple. (I can’t find the link; I read it from Matthew Siegler.)

In the scenario of TV,  what annoys users most is probably switching channels and searching for programs.  If I built a TV, I would also eliminate any set-top boxes. (So cable companies would hate me a lot.) 
With the technology profiles of all the big companies, Apple seems to own all the technologies needed.  It also takes quite a lot of design (with taste) to realize such a device. 

Using Python to Code by Voice

Here is an interesting look at how ASR can be used for coding.   Some notes/highlights:

  • The speaker, Travis Rudd, developed RSI two years ago.  After a climbing accident, he decided to code by voice instead.  His RSI has since recovered, but he claims he still codes by voice 40-60% of the time. 
  • He uses around 2,000 voice commands, which are not necessarily English words.   He used Dragonfly to control Emacs on Windows.
  • How do variables work?  It turns out most variable names are actually English phrases, and there are specific commands to delimit those phrases with different characters. 
  • The speaker said “it’s not very hard” for others to replicate, though I believe some amount of customization is needed.  It took him around 3 months, which is pretty much how long a solutions engineer needs to tune an ASR system. 
  • The best language to program in by voice: Lisp. 
One more thing: Rudd also believes it would be very tough to do the same thing with CMU Sphinx.  
Ah…… models, models, models. 
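For the curious, here is a rough sketch of what a Dragonfly command grammar (mentioned above) looks like.  The spoken phrases and keystrokes are my own hypothetical examples, and actually running this requires Windows with a speech backend such as Dragon NaturallySpeaking:

from dragonfly import Grammar, MappingRule, Text, Key

class CodingRule(MappingRule):
    # map spoken phrases to keystroke actions
    mapping = {
        'new function': Text('def '),
        'print that': Text('print()') + Key('left'),
    }

grammar = Grammar('python coding')
grammar.add_rule(CodingRule())
grammar.load()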

Earlier on Grand Janitor’s Blog

Some quick notes on what a “good training system” should look like (link).
GJB reached its 100th post! (link)
Arthur

Python multiprocessing

As my readers may have noticed, I haven’t updated this blog much lately because of a pretty heavy workload. It didn’t help that I was sick in the middle of March as well. Excuses aside, I am happy to be back. Even if I can’t write much about Sphinx and programming, I think it’s still worth it to keep posting links.

I have also received requests to write more about individual parts of Sphinx in detail.   I love these requests, so feel free to send me more.   Of course, it usually takes me some time to fully grok a certain part of Sphinx before I can describe it in an approachable way.   Until then, I can only ask for your patience.

Recently I have been coming across parallel processing a lot and was intrigued by how it works in practice. In Python, a natural choice is the multiprocessing library. So here is a simple example of how you can run multiple processes in Python. It is very useful on modern CPUs, which have multiple cores.

Here is an example program showing how that could be done:

1:  import multiprocessing
2:
3:  def process(i):
4:      # placeholder task body; the real work was elided in the original
5:      print('Running task %d' % i)
6:
7:  N = 4  # number of tasks to spawn
8:  jobs = []
9:  for i in range(N):
10:     p = multiprocessing.Process(target=process,
11:                                 name='TASK' + str(i),
12:                                 args=(i,))
13:     jobs.append(p)
14:     p.start()
15: for j in jobs:
16:     if j.is_alive():
17:         print('Waiting for job %s' % j.name)
18:     j.join()

The program is fairly trivial. Interestingly enough, it is also quite similar to the multithreading version in Python. Lines 9 to 14 create and start the tasks, and lines 15 to 18 wait for them to finish.

It feels a little less elegant than using Pool, which provides a waiting mechanism for the entire pool of tasks.  Right now, I join the jobs one by one in order, so I may be waiting on an early job while later jobs have already finished.
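For comparison, here is what the same idea looks like with Pool; the task body is a placeholder:

import multiprocessing

def process(i):
    return i * i  # placeholder task

if __name__ == '__main__':
    pool = multiprocessing.Pool(processes=4)
    results = pool.map(process, range(10))  # blocks until every task finishes
    pool.close()
    pool.join()
    print(results)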

Is it worthwhile to go down the other path, thread-based programming?  One thing I learned in this exercise is that in older versions of Python, a multi-threaded program can paradoxically be slower than the single-threaded one. (See this link from Eli Bendersky.) That may be less of an issue in recent versions of Python, though.

Arthur


Learning vs Writing

I haven’t done any serious writing for a week, mostly posting interesting readings just to keep up the momentum.   Work is busy, so I have slowed down.  Another concern is what to write about: some of the topics I have been covering, such as Sphinx4 and SphinxTrain, take a little bit of research to get right.

So far I think I am on the right track.  There are not many bloggers writing about speech recognition.  (Nick is an exception.)   To really increase awareness of how ASR is done in practice, blogging is a good way to go.

I also describe myself as “recovering” because for a couple of years I hadn’t seriously thought about open-source Sphinx.  In fact, though I was working on speech-related stuff, I didn’t spend much time on mainstream ASR either, because my topic was too esoteric.

Not to mention that many new technologies have emerged in the last few years.   The major one, I would say, is the use of neural networks in speech recognition; they probably won’t replace HMMs soon, but they are already a mainstay at many sites.   WFSTs, with more tutorial-style literature available, have become more and more popular.    In programming, Python is now a mainstay and a job-proof language, and useful toolkits such as SciPy and NLTK deserve book-length treatment by themselves.  Java is starting to be like C++, a kind of necessary evil you need to learn.  C++ has a new standard.   Ruby is huge in the web world and is fun to learn in its own right.

All of these new technologies put me back into a kind of learning mode, so some of my articles have become longer and more detailed.   For now, they probably cater to only a small group of people in the world.   But that’s okay: when you blog, you just want to build landmarks on the blogosphere.   Let people come searching for them and benefit from them.   That’s my philosophy for keeping this site going.

Arthur


Readings from Jan 19, 2012

C spiral law of parsing

Python vs Scheme

How do you manage geeks?

Things you know about time.

Arthur


Readings for Jan 14, 2013

Approximation relating lg, ln, and log10 by John Cook
Digits in Power of 2 by John Cook
Rise and Fall of Third Normal Form by John Cook
The Crown Game Affair : Fascinating account on how cheating in chess was detected and caught.
Introduction to Conditional Random Fields : Eerily similar to HMMs; that’s perhaps why so many “cute” ideas were published on them in the past.
Stuff Harvard People Like by Ed Chen: Entertaining post.  I do know some people from MIT, and the study fits the stereotype I recognize.   For Harvard, not as much; some are really mellow fellows.   Also, not everyone at a school is savvy with computers, so the sample Ed Chen collected may or may not be representative. (To his credit, he named all his assumptions in his post.)

Arthur


Installation of Python and Pygames

I was teaching my little brother how to make a game, and Pygame naturally came to mind as it is pretty easy to understand and program with.

I have tried Pygame on both Ubuntu and Windows, and both are fine.  On Windows, though, I found that using the installers for both Python and Pygame is simplest.  I was using Python 2.7.  If you had installed Pygame 1.7 or earlier, make sure you remove the pygame directory under the existing installation before you install.
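Once everything is installed, a minimal sketch like this can verify that Pygame works: it opens a blank window and exits when you close it.

import pygame

pygame.init()
screen = pygame.display.set_mode((320, 240))
pygame.display.set_caption('Hello Pygame')

running = True
while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False
    screen.fill((0, 0, 0))  # black background
    pygame.display.flip()
pygame.quit()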

Arthur


Some Reflections on Programming Languages

This is actually a self-criticizing piece.  Oh well, calling it a reflection doesn’t hurt.

When I first started out in speech recognition, I had the notion that C++ was the best language in the world.  For daily work? “Unix commands such as cut and split work well.”  To take care of most of my processing issues, I used some badly written bash shell scripts.  Around the middle of grad school, I learned that Perl is lovely for string processing.   Then I thought Perl was the best language in the world, except that it is a bit slow.

After C++ and Perl, I learned C, Java and Python, picked up a little Objective-C, and sampled many other languages.   For now, C and Perl are probably the two languages I am most proficient in, and I tend to like them the most.   There is one difference between me and the twenty-something me, though: instead of arguing about which language is the best, I simply go and learn more about any programming language in the world.

Take C as an example. Many praise it as the procedural language closest to the machine.  I love to use C and write a lot of my algorithms in it.  But when you need to maintain and extend a C codebase, it is a source of pain: there is no inherent inheritance mechanism, so programmers need to implement their own class system, with many function pointers.  There is also no memory checking, so an extra step of memory checking is necessary.  Debugging is a special skill, too.

Take Perl.  It is very useful for text processing and has very flexible syntax, but this flexibility also makes Perl scripts hard to read sometimes.    For example, for a loop, do you implement it as a foreach loop or as a map?   Such choices confuse lesser programmers.  Also, when trying to maintain a large-scale project in Perl, many programmers remark to me that OOP in Perl seems to “just organize the code better”.

How about C++?  We love the templates; we love the structure.   In practice, though, the standard changes all the time, so most houses fix the compiler version to make sure their C++ source code compiles.

How about Java?  There is memory-boundary checking.  After a year or two at a dot-com, I also learned that the Tomcat servlet is a thing in web development.   It is also easy to learn and one of the mainstream programming languages taught in school these days.  Those I dig.  What’s the problem? You may say speed is an issue. Wrong: much Java code can be optimized to run as fast as its C or C++ counterpart.   The issue in practice is that the process of bytecode conversion is non-trivial to many people, which raises doubts within a software team about whether the language is the cause of speed issues.

For me, I also care about the fate of Java as an open language after Oracle bought Sun Microsystems.

How about Python?  I guess this is the language I know the least about.  So far, it seems to take care of a lot of the problems in Perl.  I found that its regular expressions take some time to learn, but other than that, the language is quite easy to pick up and quite nice to maintain.  I guess the only thing I would say is that the slight differences between Python 2.x versions are starting to annoy me.

I guess the more important point here is that every language has its strengths and weaknesses.  In real life, you probably need to be prepared to write the same algorithm in any language you know.   So there is no room to say, “Hey! Programming language A is better than programming language B. Wahahaha.  Since I am using A rather than B, I rock, you suck!”  Rather, you have to accept that writing in an unfamiliar language is an essential part of a tech person’s life.

I learned this from my spiritual predecessor, Eric Thayer, who organized the source code of SphinxTrain.  He once said to me (I am paraphrasing), “Arguing about programming languages is one of the stupidest things in the world.”

Those words enlightened me.

Perhaps that is why I keep reading “C Programming: A Modern Approach”, “The C++ Programming Language”,  “Java in a Nutshell”, “Programming Perl” and “Programming Python” from time to time: I never feel satisfied with my skills in any of them.  I hope to learn D and Go soon, and to make sure I am proficient in Objective-C.  It will take a lifetime to learn them all, but for something as deep as programming, learning, rather than arguing, seems to be the better strategy.

Arthur