Bots could be filtered out for the most part by looking at past projects, though. phpBB is a popular forum system that counts views, and it gets pretty accurate results after filtering out 45 different bots. NNF could easily adopt this list.
The StackExchange question on this subject is interesting to look at too. The advice there boils down to:
1. Create a user-agent blocklist, like phpBB has, but possibly use a bigger dataset such as the database by robotstxt.org or botsvsbrowsers.com.
2. Ignore all user-agents sporting words like "bot", "crawler" or "spider".
3. Create a honeypot to collect more bot user-agents.
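The blocklist idea in step 1 could be sketched roughly like this (a Python sketch for illustration; the list entries and the keyword pattern here are placeholders, not phpBB's actual data):

```python
import re

# Placeholder entries; a real list would come from phpBB's bot table,
# robotstxt.org or botsvsbrowsers.com.
BOT_SUBSTRINGS = ["googlebot", "bingbot", "yandexbot", "baiduspider"]

# Generic keywords commonly suggested for catching unknown bots.
BOT_KEYWORDS = re.compile(r"bot|crawler|spider", re.IGNORECASE)

def is_bot(user_agent: str) -> bool:
    """Return True if the user-agent matches a known bot or a generic keyword."""
    ua = user_agent.lower()
    if any(s in ua for s in BOT_SUBSTRINGS):
        return True
    return bool(BOT_KEYWORDS.search(user_agent))
```

A request whose user-agent passes `is_bot()` would simply be skipped when incrementing the counter.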
Most of this feels like it would add a lot of bloat to the NNF source code, though.
Another problem (that I just thought of) is that NNF would need to introduce some sort of session system, otherwise many valid users will inflate the counter as well: I would be counted when I read page 1 of a thread, again on page 2, again on page 3, and then after hitting the reply button I would be counted yet again. So even excluding bots this will be a problem.
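The session deduplication could be as simple as remembering which threads a visitor has already viewed. A minimal sketch, assuming a per-visitor `session` dict (however NNF would end up storing it) and a `counters` mapping of thread id to view count:

```python
def count_view(session: dict, counters: dict, thread_id: str) -> None:
    """Increment a thread's view counter at most once per session.

    Paging through the thread or posting a reply re-enters this
    function, but the counter only moves on the first visit.
    """
    seen = session.setdefault("viewed_threads", set())
    if thread_id not in seen:
        seen.add(thread_id)
        counters[thread_id] = counters.get(thread_id, 0) + 1
```

With this in place, reading pages 1 to 3 and then replying would still count as a single view.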
phpBB's bot list: https://github.com/phpbb/phpbb/blob/21e55ea6d23c94cbbcd672dcf31c939dd7ef08e9/phpBB/install/install_install.php#L2102-L2148 (latest master commit, as of today)