2015 RedList: Security Startups

The 2015 results of the Red List Security Startup survey are in.  I really enjoy running this survey.  The ability to identify early-stage security vendors that are doing good work is valuable in my mind.  This year the survey went out to over 40,000 security practitioners globally, by far the largest group to date.  Thank you to everyone who participated.

Click here for the full report

This year we see many of the same players as we did in the 2013 report.  Given the significant amount of investment that has flowed into the security space, the lack of new names is an interesting point.

The top 20 Security Startups

  1. Phishme
  2. OpenDNS
  3. Okta
  4. BlueBox
  5. Agari
  6. Vormetric
  7. Risk I/O
  8. Cylance
  9. AlienVault
  10. LastPass
  11. Ionic
  12. Ciphercloud
  13. HackerOne
  14. Data Theorem
  15. Skyhigh Networks
  16. ProtectWise
  17. NowSecure (viaForensics)
  18. CloudPassage
  19. CrowdStrike
  20. Norse

Unconstrained by Actual Facts

Over the past couple of years the conversation around CyberSecurity has exploded from a term we used as practitioners to one that is top of mind for most people in the U.S.  The media frenzy, initiated by the Snowden disclosures, has created a multi-year news cycle that continues today.  As a result we have a lot of commentators on the threats, compromises, budgets, skills, etc. whose claims are devoid of actual facts.  I’ve taken some time to apply some data to the overall “threat” landscape that has been talked about for so long.  While this is clearly not scientific and is open to debate, I feel it brings some data to the conversation as opposed to the faceless statements that echo around the industry.  So much so, in fact, that an echo-chamber effect is starting to take hold well outside of practitioner circles now that the media has focused on the topic.

Read More

AV Testing in Dispute?

There was a lot of chatter last month about the validity of AV-Test, Microsoft's performance, etc. The Register did a writeup on it here. Personally, I think the debates are ridiculous. In the absence of any other form of remotely scientific test, what can we do? Yes, there are holes in the methodology based on what wasn't caught before, etc., but I have yet to see an alternative to this test today. With that, we need to acknowledge the gaps and appreciate it for what it is.

Watching TV with DHS

On Jan 17th a posting about the release of DHS's 2011 Analyst Desktop Binder was covered in a number of places, but it wasn't until now that I took a look. I thought I would pass it along in case anyone wants to read it (here). Aside from the monitored sites and keywords, noted below, a couple of other things stand out.

1) Why are there shared IDs and passwords for the systems the analysts use? Seems to me that this goes against standard rules on attribution, etc.
2) Why are the passwords for those accounts stored in the Binder? Either they update the Binder every 90 days or no one changes the passwords. Either way, it seems like another violation.
3) I know full well the pains of managing security in a network; however, when you have to document your issues and release the Binder, it might move one to fix them. Page 37 shows that an invalid certificate should be "clicked through". I can only imagine it's a self-signed cert.
4) Under the CyberSecurity phrases there are some really old ones: "Conficker", "2600", "Cain and Abel", etc. This is a shockingly old and small list of text identifiers for possible issues. I can only hope there is a much larger list that is actually monitored and kept up to date.

Keywords can be found on page 20+.
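To illustrate the kind of simple text-identifier matching the Binder describes, here is a minimal sketch. The phrase list below is an abbreviated assumption drawn from the handful of terms quoted above, not the full DHS list, and the matching logic is my own guess at the approach rather than anything documented in the Binder:

```python
import re

# A few of the monitored phrases quoted above; the real Binder list
# (page 20+) is much longer. These three are assumptions for illustration.
MONITORED_PHRASES = ["Conficker", "2600", "Cain and Abel"]

# Precompile one case-insensitive pattern with word boundaries so that
# "2600" does not match inside a longer number like "126000".
PATTERN = re.compile(
    r"\b(" + "|".join(re.escape(p) for p in MONITORED_PHRASES) + r")\b",
    re.IGNORECASE,
)

def flag_post(text):
    """Return the monitored phrases found in a piece of text, lowercased."""
    return sorted({m.group(1).lower() for m in PATTERN.finditer(text)})

print(flag_post("Saw a Conficker variant discussed on 2600 today."))
# ['2600', 'conficker']
```

Even this toy version shows why a small, stale phrase list is a problem: anything not literally on the list sails right through, which is exactly the concern with terms like "Conficker" still being the watchwords years later.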