Check for inconsistencies between tags. Nothing prevents multiple sets of robots meta tags on a page, either. Maybe your template added a robots meta tag set to index, then a plugin added one set to noindex. You can't assume there's only one tag per page, so don't stop your search after the first one. I've seen up to four sets of robots meta tags on the same page, with three of them set to index and one set to noindex, and that noindex wins every time.
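If you want to check this programmatically, here's a minimal sketch in Python, assuming requests and BeautifulSoup are installed; the URL is a placeholder:

```python
import requests
from bs4 import BeautifulSoup

url = "https://example.com/some-page"  # placeholder URL
html = requests.get(url, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

# Collect every robots meta tag, not just the first match.
tags = soup.find_all("meta", attrs={"name": "robots"})
for tag in tags:
    print(tag.get("content"))

# If any tag contains noindex, that directive wins.
if any("noindex" in (t.get("content") or "").lower() for t in tags):
    print("Effective directive: noindex")
```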
Change UA to Googlebot

Sometimes you just need to see what Google sees. There are many interesting issues around cloaking, user redirection, and caching. You can change your user agent with the Chrome Developer Tools or with a plugin like User-Agent Switcher. If you're going to do this, I recommend doing it in incognito mode. You want to check that Googlebot isn't being redirected somewhere; for example, it may never see a country-specific page because its US IP address gets redirected to another page.
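Here's a quick sketch using Python's requests library (an assumption on my part; curl or the browser tools above work just as well). It fetches the same URL with a normal browser UA and with Googlebot's published UA string, without following redirects, so any difference in response stands out:

```python
import requests

# Googlebot's desktop user-agent string, as published by Google.
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

url = "https://example.com/some-page"  # placeholder URL

# Fetch the page with each user agent, without following redirects,
# so any UA-based redirect or cloaking shows up in the output.
for ua in ("Mozilla/5.0", GOOGLEBOT_UA):
    resp = requests.get(url, headers={"User-Agent": ua},
                        allow_redirects=False, timeout=10)
    print(ua[:40], resp.status_code, resp.headers.get("Location"))
```

Keep in mind that a site verifying Googlebot by reverse DNS will still treat this request as a regular client, so this only surfaces differences keyed off the user-agent string itself.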
Robots.txt

Check your robots.txt file for anything that might be blocked. If you block a page from crawling and place a canonical tag or a noindex tag on that page, Google can't crawl the page and so can't see those tags. Another important tip is to monitor your robots.txt file for changes. Someone may change something, a development server may introduce unintended shared caching issues, or any number of other things may go wrong, so it's important to keep an eye on this file.
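Both checks are easy to script. Here's a minimal sketch using only Python's standard library (the URLs are placeholders): it asks whether Googlebot may fetch a given page, then hashes robots.txt so you can compare hashes between runs and catch silent changes.

```python
import hashlib
from urllib.request import urlopen
from urllib.robotparser import RobotFileParser

robots_url = "https://example.com/robots.txt"  # placeholder URL

# 1) Is Googlebot allowed to crawl this page?
rp = RobotFileParser(robots_url)
rp.read()
page = "https://example.com/some-page"  # placeholder URL
print("Googlebot allowed:", rp.can_fetch("Googlebot", page))

# 2) Hash the file on each run and compare against the previous
# run's hash to detect unexpected edits or caching issues.
body = urlopen(robots_url).read()
print("robots.txt sha256:", hashlib.sha256(body).hexdigest())
```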