Cyber Threat Intelligence (CTI) Gets Boost From Research At Georgetown University: Dr. Eric Burger of Georgetown University provides insightful views on what is happening in the CTI arena, areas of research his team is conducting, and challenges still facing CTI adoption and standards support in this recent interview with ActiveCyber.

I recently trekked down to Georgetown University to meet with Dr. Eric Burger whose team of students and colleagues is conducting research in a variety of areas related to cyber threat intelligence. Eric had graciously accepted my invitation for an interview and enthusiastically laid out the research goals and progress his team is making. Besides the technology area, Eric’s team is also focused on policy and economics of cyber threat sharing – areas which are instrumental to the overall success of cyber threat intelligence as a viable market for vendors and for the overall effectiveness of intel-based defenses. So please read on and learn more about this important research and Eric’s views on the current status and future of cyber threat intelligence sharing and where emerging standards such as STIX/TAXII are headed. And a shout out to Jeremy W for the lead!

Spotlight on Dr. Eric Burger, Georgetown University

» Title: Research Professor of Computer Science at Georgetown University in Washington, D.C.
Read his bio below.

September 28, 2015

Chris Daly, ActiveCyber: Recently thousands of websites that run the content management system WordPress were hijacked by hackers to infect unsuspecting visitors with malware exploits. How could cyber threat intelligence (CTI) have helped mitigate this problem and other problems of this nature?

Dr. Eric Burger: In two ways. First, it would get the message out to people who host WordPress about what to look for and how to remediate the holes. Second, it would give people secondary indicators of compromise, since the malware itself can be scanned for.

Daly: What are your key focus areas related to CTI?

Burger: At Georgetown we are focused on removing the barriers to cyber threat intelligence information sharing. Our specific focus areas are technology, policy, law, and economics. As for technology, we are helping industry sort through the various CTI sharing technologies, including working on developing the underlying mechanisms where there are gaps, so that industry only needs to implement a single or very small set of technologies. We are looking at the legal aspects of business-to-business sharing: what enterprises can do out of the box, as well as what kinds of legal agreements need to be in place to protect those sending the data, those receiving it, and those whom the data is about. From a policy perspective, we are looking at law and case law that can serve as models for enhancing information sharing, as well as at barriers to information sharing.

Daly: What should be the CIO/CISO priorities related to CTI? Interoperability of CTI stacks? Does having more than one CTI source really add significant benefit?

Burger: The industry is still very young. There are just a handful of CTI vendors offering either STIX/TAXII or IODEF/RID stacks, and even those stacks are still evolving. Today's CIO/CISO priority needs to be making sure their vendors have a clear roadmap to supporting the generation and consumption of a standard data feed.

Having multiple CTI sources is critical. For example, no vendor of cyber security information has up-to-the-minute information on all attacks and attack vectors. It is true there is a lot of overlap, but even the major vendors of threat data find they exchange significant numbers of unique indicators on a daily basis.

Daly: There seems to be a flood of CTI vendors on the market today. What criteria do you think would help CISOs separate the wheat from the chaff?

Burger: Many of the CTI vendors have a great idea in theory: that everybody can be sensors for everybody else. What the CISO needs to look at is whether the particular vendor has a critical mass of sensors. More importantly, for most enterprises, a raw data feed of threats is useless. Unless the enterprise is in the threat intelligence business, its focus needs to be on running the enterprise, not untangling intricate threats. As such, the CISO needs to make sure their vendors provide relevant scoring and actionable remediation proposals. Note this is different from the oft-quoted "actionable intelligence." All enterprises are different, and what can be highly actionable intelligence for one enterprise can be utterly irrelevant for another. Moreover, even if the intelligence is actionable, the enterprise needs to figure out what that action should be. If the CISO can find a vendor to help with that latter part, they have a very valuable partner in their CTI vendor.

Daly: How is STIX/TAXII making a difference in CTI? What is the rate of adoption by CTI vendors? What are some barriers to greater adoption?

Burger: STIX/TAXII is making a tangible difference for CTI. Specifically, the amount of outreach performed by DHS and the vendor community has raised the importance of interoperable feeds to the CIO and CISO level. As such, even though adoption by CTI vendors is very slow at this point, the value of STIX and TAXII cannot be overstated.

The barriers are technical (STIX is incredibly complex and very easy to get wrong), economic (for many enterprises, the whole information sharing exercise has not received a meaningful business case study), and legal (it is not clear what the liabilities are for receiving and sending information).

Daly: Do you believe a single data format and exchange standard such as STIX/TAXII will take hold in the CTI space? What role should government play in CTI standards adoption?

Burger: Yes and no: there will always be point solutions for tightly integrated verticals. An example of this is the use of NMSG in the domain name community. Given the years of deployment experience with NMSG and the amount of tooling available, it will be quite a while before STIX replaces NMSG. On the other hand, as there will be more and more network-level integration of feeds, the industry must coalesce on a single interchange format.

An analogy from the networking world would be that there used to be DECnet, IPX, SNA, and IP, all of which could run over Ethernet. Today, DECnet, IPX, and SNA are all tunneled by IP for the few remaining legacy applications using those legacy protocols. IP became the universal inter-networking protocol. One can see a similar role for STIX for packaging and TAXII for transport.

Daly: Is the current STIX/CybOX data model overkill for what enterprises and vendors really need?

Burger: Today, yes. People are still looking to simply automate CSV or PDF lists of IP addresses and domains to block. However, implementing STIX and CybOX means the enterprise can put into place programs to automate the mundane cyber protection activities, leaving personnel to start looking at the overall situation, instead of spending all of their time putting out fires.
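As a concrete illustration of the mundane automation Burger describes, even a plain CSV feed of IP addresses can be turned into block rules with a short script. This is a minimal sketch; the CSV layout, function name, and iptables-style output are assumptions for illustration, not any vendor's format:

```python
import csv
import ipaddress

def blocklist_to_rules(csv_path):
    """Read a one-column CSV of IP addresses and emit iptables-style
    drop rules, skipping anything that does not parse as an address."""
    rules = []
    with open(csv_path, newline="") as f:
        for row in csv.reader(f):
            if not row:
                continue
            candidate = row[0].strip()
            try:
                addr = ipaddress.ip_address(candidate)
            except ValueError:
                continue  # ignore headers, comments, malformed lines
            rules.append(f"iptables -A INPUT -s {addr} -j DROP")
    return rules
```

The point of moving from ad hoc scripts like this to STIX/CybOX is that the same pipeline can then carry richer context (confidence, TTPs, relationships) rather than bare addresses.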

Daly: Given your background in database theory, what are the pros and cons of the STIX/CybOX data model? How well suited is the data model and XML schema to other uses as users move beyond just threat exchange to things like machine learning, predictive analysis, finding attack correlations, etc? Ultimately do you believe STIX will survive with XML as the data format approach or should STIX standards and development efforts convert to other approaches?

Burger: There are pros and cons to the data model. On the pro side is separation of concerns. CybOX can focus on modeling indicators while STIX can focus on correlating indicators, TTPs, etc. If the same indicator (CybOX) shows up in multiple places, STIX can note that it is the same entity, not multiple instances. On the con side, it is very complex to implement parsers and generators.
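The separation of concerns Burger describes can be sketched with plain dictionaries: observables carry their own IDs, and indicators refer to them by ID, so the same observable cited by several reports resolves to one entity rather than multiple copies. The field names below are invented for illustration and are not the actual CybOX/STIX schema:

```python
# Hypothetical, simplified stand-ins for CybOX observables and STIX
# indicators; the real schemas are far richer, but the ID-reference
# pattern is the same.
observables = {
    "obs-1": {"type": "ipv4", "value": "203.0.113.9"},
}

indicators = [
    {"id": "ind-a", "observable_ref": "obs-1", "source": "vendor-x"},
    {"id": "ind-b", "observable_ref": "obs-1", "source": "vendor-y"},
]

def resolve(indicator, observable_store):
    """Follow an indicator's reference to the shared observable object."""
    return observable_store[indicator["observable_ref"]]

# Both indicators resolve to the *same* observable entity, not two
# copies a consumer would have to de-duplicate afterwards.
same_entity = resolve(indicators[0], observables) is resolve(indicators[1], observables)
```

The cost of this design shows up exactly where Burger notes: a conforming parser must track and resolve these cross-references, which is much harder than reading a flat list.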

The XML question is purely religious and not grounded in computer science at all. XML is "less than ideal," but just about all of the "better" alternatives suffer similar issues. Because it is a religious war and I am not married to XML, I do not really care if the flavor du jour is JSON. What is important is the data model, the semantics, and the relationships between the data elements of STIX. That is an area of active research at Georgetown.

[ed. comment: From the comments so far on the GitHub STIX wiki, the consensus right now in the STIX community is for JSON to be used as the MTI (mandatory-to-implement) binding for STIX.]
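To make Burger's "data model over syntax" point concrete: the same indicator content survives serialization unchanged, because the semantics live in the fields and their relationships, not in the markup. The field names below are illustrative, not the official STIX binding:

```python
import json

# Illustrative indicator record; the field names are invented for this
# example and do not follow any official STIX schema.
indicator = {
    "id": "example:indicator-1",
    "title": "Known C2 address",
    "observable": {"type": "ipv4", "value": "198.51.100.7"},
}

# Serialize and parse back: the data model round-trips intact,
# regardless of whether the wire format is XML or JSON.
wire = json.dumps(indicator)
restored = json.loads(wire)
```

Choosing JSON over XML changes the `wire` string, but not the model, which is why the binding debate matters less than the semantics.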

Daly: Given the premise of threat exchange, how are issues with attribution and global object naming handled in STIX?

Burger: Attribution of attack source: that is much of the point of STIX, particularly STIX as compared with IODEF. STIX looks at the whole attack, since knowing who is conducting the attack, and why, can influence what remediation you take.

Attribution of reports: this is where things get sticky. In theory, an enterprise may not want to be attributed with an indicator, as it may leak that the enterprise was compromised. However, FS-ISAC and others have found that attribution of reports is an important component in the recipient’s calculation of how much trust to put into the report. In other words, attribution may be a critical success factor contributing to whether cyber threat intelligence information exchange is useful and valuable.

Daly: How are your students engaging in CTI initiatives?

Burger: I have a PhD student working on the taxonomies and thesaurus mapping the cyber intelligence information ontology landscape. I have had an MBA student working on the economic models for intelligence sharing, the value of cyber security investment, and government financial incentives for enterprises to invest in information sharing. I have undergraduate students looking at the needs of enterprises with respect to policy and law.

Thanks Eric for sharing your views and insight into the new world of cyber threat intelligence and cyber threat indicator sharing. I believe many of ActiveCyber’s readers will benefit from the research your team is developing to create better cyber threat intelligence to improve our active cyber defenses. I also invite ActiveCyber’s readers to connect with Dr. Burger to learn more about the research his team is performing.

And thanks for checking out ActiveCyber! Be on the lookout for more articles and interviews coming shortly. Please give us your feedback on this interview, and let us know what topics you'd like to hear about in the area of active cyber defenses.

Check out our free eBook if you haven't already – Protecting the Future Enterprise: Active Cyber Defense – The Definitive Guide for Next-Gen Cyber Protections. You can download it at our training and resources page here. Also, email us if you're interested in interviewing or advertising with ActiveCyber.

About Dr. Eric Burger

Eric Burger is Research Professor of Computer Science at Georgetown University, where he is also Director of the Georgetown site of the NSF Security and Software Engineering Research Center (S2ERC). One area of S2ERC research is the cyber threat intelligence information exchange ecosystem program. This program has projects looking at and developing technologies such as STIX/TAXII and IODEF/RID, as well as the legal, economic, and policy hurdles to clear to enable enterprises to share cyber threat intelligence.

Prior to Georgetown, Eric was CTO of Neustar, VP of Engineering and Deputy CTO at BEA Systems, and CTO of Cantata, Brooktrout, and SnowShore Networks. Eric started in telecom at MCI and Cable & Wireless, and started in research at the Texas Instruments Central Research Labs and Valid Logic Systems (now Cadence). Eric wrote a number of Internet protocol RFCs in the IETF in the areas of real-time multimedia, mobile messaging, and messaging and communication security, and is currently active in the development of MILE (IETF) and CTI (OASIS). He serves as a Director of the Public Interest Registry and Ascension Technology Group and as an advisory board member of a number of companies in the security and networking space.