I’d do the following:
1. Look at the sitemap and check it for errors; if there are any, you can plan how you'll fix them – otherwise you're fighting in the dark! (There's a quick sitemap-checking sketch after this list.)
2. Run a crawler like Screaming Frog [free] to see what's redirecting, whether duplicate pages are being created [which hurts rankings], and a whole host of other things.
3. Look at Webmaster Tools, especially the HTML Improvements section and the crawl errors, then use the robots.txt Tester to refine your robots.txt file so that Google finds the pages you want to serve rather than whatever it decides is best. (There's a short robots.txt sanity-check sketch after this list too.)
4. As Ruda suggested, use the Remove URLs tool in Webmaster Tools to speed this up.
5. Build links to the pages you want to get indexed / ranking higher in the SERPs.
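
For point 1, here's a rough sketch of how you could check a sitemap yourself – a small Python script that fetches the sitemap, reads each URL listed in it, and flags anything that doesn't come back with a 200. The sitemap address is just a placeholder for your own.

```python
# Rough sketch: fetch a sitemap and flag URLs that don't return 200.
# The sitemap URL is a placeholder - swap in your own site's.
import urllib.request
import urllib.error
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def check_sitemap(sitemap_url):
    # Parse the sitemap XML and pull out every <loc> entry
    with urllib.request.urlopen(sitemap_url) as resp:
        tree = ET.parse(resp)
    for loc in tree.getroot().findall("sm:url/sm:loc", NS):
        url = loc.text.strip()
        try:
            with urllib.request.urlopen(url) as page:
                status = page.status
        except urllib.error.HTTPError as err:
            status = err.code
        if status != 200:
            print(f"{status}  {url}")

if __name__ == "__main__":
    check_sitemap(SITEMAP_URL)
```

(urlopen follows redirects on its own, so this only catches hard errors like 404s and 500s – a crawler like Screaming Frog will still give you the full redirect picture.)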
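And for point 3, once you've edited robots.txt you can sanity-check it locally before trusting it with rankings – this sketch uses Python's built-in robotparser to confirm the pages you want served are crawlable and the ones you're hiding are blocked. All the URLs here are placeholders for your own.

```python
# Rough sketch: check robots.txt against the pages you care about.
# All URLs are placeholders - replace them with your own.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
parser.set_url("https://www.example.com/robots.txt")  # placeholder
parser.read()

# Pages Google should be able to reach
should_be_crawlable = [
    "https://www.example.com/",
    "https://www.example.com/products/",
]
# Pages you're deliberately keeping out of the crawl
should_be_blocked = [
    "https://www.example.com/search?q=duplicate",
]

for url in should_be_crawlable:
    if not parser.can_fetch("Googlebot", url):
        print(f"WARNING: blocked but should be crawlable: {url}")

for url in should_be_blocked:
    if parser.can_fetch("Googlebot", url):
        print(f"WARNING: crawlable but should be blocked: {url}")
```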