Centre for Doctoral Training in Cyber Security

CDT Research Showcase 2019

Wednesday 2nd October 2019

Sultan Nazrin Shah Centre, Worcester College, Oxford

The annual CDT Research Showcase is an opportunity for our invited delegates to meet our current students and recent alumni, discuss their research and explore the wide range of work being undertaken within the CDT in Cyber Security.

This year, the following talks and posters were presented at the Showcase.

Research Talks Presented:

Going Dark: A Case Study of Banned Darknet Drug Forums

Selina Cho, CDT17

In March 2018, some of the largest Reddit forums related to darknet markets were banned overnight, to the surprise of many users. For existing users, whose trading activity relies heavily on information sharing, the ban threatened the community as a whole, since that community exists almost entirely through virtual interfaces. This presentation highlights the popular discussion topics and the overall sentiment that surfaced in newly founded forums as a result, and assesses the broader implications of such disruptions for the drug-related darknet community.

Presentation slides

 

Creating an IDS Rules Framework for Trains and Tanks

Matthew Rogers, CDT18

This talk will cover an intrusion detection markup language, akin to Snort, designed for serial data bus networks. The implementation works for all J1939 systems, which encompass commercial trucking, rail, weapons systems, and even some building security and medical systems such as MRI machines. We implemented 30,000 J1939 rules that apply to all such systems. With these rules we raise the bar for adversaries, requiring them to infect the engine's Electronic Control Unit (ECU) in order to send engine messages. When combined with rules that guarantee messages fit within the standard, the adversary must physically remove the target ECU to add malicious firmware without being detected.
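To make the rule concept concrete, the short Python sketch below illustrates the two checks the abstract describes: that a given message type may only come from its legitimate sender, and that its data must fit within the standard. It is not the talk's markup language or rule set, and the PGN, source address, and byte layout used here are illustrative assumptions.

# A rough illustration only, not the talk's rule language. The PGN,
# source address, and byte layout below are assumed values for the example.

from dataclasses import dataclass

ENGINE_SOURCE_ADDRESS = 0x00   # assumed J1939 source address of the engine ECU
EEC1_PGN = 0xF004              # Electronic Engine Controller 1 (carries engine speed)

@dataclass
class Rule:
    pgn: int                   # parameter group the rule applies to
    allowed_sources: set       # ECUs permitted to broadcast this PGN
    byte_range: tuple          # (start, end) slice of the 8-byte payload
    valid_range: tuple         # (min, max) raw value allowed by the standard

def source_address(can_id: int) -> int:
    # The J1939 source address occupies the low byte of the 29-bit identifier.
    return can_id & 0xFF

def pgn(can_id: int) -> int:
    # Simplified PGN extraction for PDU2-format (broadcast) messages.
    return (can_id >> 8) & 0xFFFF

def check(rule: Rule, can_id: int, data: bytes) -> bool:
    # Return True if the frame passes the rule; False means "raise an alert".
    if pgn(can_id) != rule.pgn:
        return True                                  # rule does not apply
    if source_address(can_id) not in rule.allowed_sources:
        return False                                 # spoofed sender
    start, end = rule.byte_range
    value = int.from_bytes(data[start:end], "little")
    low, high = rule.valid_range
    return low <= value <= high                      # out-of-spec data

# Only the engine ECU may broadcast EEC1, and the raw engine-speed field
# must stay within the range the standard defines (values are assumptions).
engine_speed_rule = Rule(EEC1_PGN, {ENGINE_SOURCE_ADDRESS}, (3, 5), (0, 0xFAFF))

spoofed_id = (EEC1_PGN << 8) | 0x17                  # EEC1 sent by a non-engine ECU
print(check(engine_speed_rule, spoofed_id, bytes(8)))   # False -> alert

In this framing, sender checks force an attacker to compromise the engine ECU itself to inject engine messages, while value checks make it hard to do anything malicious without leaving the specification, which is the escalation the talk describes.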

Presentation slides

 

The Alexa in the Room: Designing the Smart Home of the Future

William Seymour, CDT16

Smart home devices have captured the popular imagination, bringing safety, convenience, and silly Alexa easter eggs into homes around the world. But these same devices have also driven widespread concerns over privacy and security that are decidedly less welcome. As designers, it is imperative that we can anticipate and understand these concerns before they cause harm, but investigating privacy is notoriously difficult. Added to this, the cultural ideal of the smart home brings a heady cocktail of historic, marketing, and ideological factors that can make research seem like an impossible task.

This talk explores the techniques used by human computer interaction researchers to explore complex issues around the adoption of devices that have yet to be invented. Current research at Oxford on smart homes serves as a backdrop to illustrate how researchers can come to know and understand what the living room of the future might look like, and how we can use design to address potential concerns before they ever have a chance to arise.

 

Deep Fake Technology

Jack Jackson, CDT18

The recent development of Deep Fake technology has led to what can only be described as the largest erosion of trust in human history. If we can no longer trust what we see or hear, what can we trust? Since the dawn of the digital age, humankind has become increasingly dependent upon electronic information exchange in every aspect of our lives. As a species we have become subordinate to these technologies, without which we would be unable to effectively communicate, learn, and ultimately function. It is only natural, then, that a large portion of the population (and society as a whole) is susceptible to manipulation through the digital medium. This phenomenon can be observed across the life-cycle of internet-era technology: from the unsubtle Nigerian prince scams to the more sophisticated and systematic manipulation of information flows, we as humans remain vulnerable to deception.

 

 “We’re All Happily Married Here!”: Intimate Partner Violence as a Cybersecurity Issue

Julia Slupska, CDT18

Feminist theorists of international relations (IR) have long argued that binaries of public/private reinforce the subsidiary status given to gendered insecurities, so that these security problems are ‘individualised’ and taken out of the public and political domain. This talk will outline the relevance of feminist critiques of security studies and argue that the emerging field of cybersecurity risks recreating these dynamics by omitting or dismissing gendered, technologically facilitated abuse such as ‘revenge porn’ and intimate partner violence (IPV). I will present a review of forty smart home security analysis papers to show that the threat model of IPV is almost entirely absent from this literature. I conclude by outlining some suggestions for cybersecurity research and design, particularly my work on “abusability testing”, and by reaffirming the importance of critical studies of information architecture.

Presentation slides

 

Negotiation Transparency in Configurable Protocols: A Case Study on the TLS Protocol and the Forward Secrecy Property

Eman Alashwali, CDT15

When a TLS client proposes optimal configurations but the server selects less than optimal ones, e.g. a lower protocol version or a non-Forward-Secure (non-FS) key exchange, the expected reason is that the server’s maximum configurations do not meet the client’s optimal configurations. However, there are various less expected reasons on the server side: misconfiguration, discrimination, malware, or implementation bugs. With current models and techniques of parameter negotiation in configurable protocols, it is not possible for the client to distinguish between these reasons. Previous work on downgrade attacks assumes a man-in-the-middle adversarial model and a trusted server that behaves honestly and correctly. In our work, we minimise trust assumptions on servers by reasoning about a new adversarial model which we call the “discriminatory” adversarial model. Through an empirical analysis of the TLS protocol and the FS property on over 10M real-world servers, using a novel heuristic approach, we show the realism of our model and that servers’ discrimination can go unnoticed. Our study shows that 5.37% of top domains, 7.51% of random domains, and 26.16% of random IPs do not select FS key-exchange algorithms. Surprisingly, 39.20% of the top domains, 24.40% of the random domains, and 14.46% of the random IPs that do not select FS do in fact support it. Finally, we discuss possible paths towards forward-secure Internet traffic. Apart from deprecating non-FS key-exchange algorithms, we propose a client-side best-effort approach which tries to guide (force) misconfigured servers towards FS by exclusively offering, either sequentially or in parallel, the ciphersuites that provide the desired properties (e.g. FS, or FS and AE), before falling back to the default configurations.
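As a rough sketch of the client-side best-effort idea (not the authors' tool), the following Python snippet first offers only forward-secret key exchanges and falls back to the library defaults if that handshake fails. The host name and the OpenSSL cipher string are illustrative assumptions.

# Minimal sketch of the "offer FS first, then fall back" behaviour described above.

import socket
import ssl

HOST, PORT = "example.com", 443        # hypothetical target server

def connect(cipher_string=None):
    # Build a default client context, optionally restricting the offered suites.
    ctx = ssl.create_default_context()
    if cipher_string is not None:
        ctx.set_ciphers(cipher_string)
    sock = socket.create_connection((HOST, PORT), timeout=10)
    return ctx.wrap_socket(sock, server_hostname=HOST)

try:
    # First attempt: offer only key exchanges that provide forward secrecy.
    tls = connect("ECDHE:DHE")
    print("negotiated with FS-only offer:", tls.cipher())
except (ssl.SSLError, ConnectionError):
    # Fall back to the default configuration so connectivity is preserved
    # when the server cannot (or will not) negotiate forward secrecy.
    tls = connect()
    print("negotiated with default offer:", tls.cipher())
tls.close()

Note that set_ciphers constrains only TLS 1.2 and earlier; TLS 1.3 suites are negotiated separately and always provide forward secrecy, so the restriction matters mainly for servers limited to older protocol versions.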

Presentation slides

 

Surveillance for Sale: How China and Russia Export Repression Technology and Techniques

Valentin Weber, CDT17

The global diffusion of Chinese and Russian information control technology and techniques has featured prominently in the headlines of major international newspapers. Few stories, however, have provided a systematic analysis of both the drivers and the outcomes of such diffusion. This talk does so, and finds that these information controls are spreading more readily to countries with hybrid or authoritarian regimes, particularly those with ties to China or Russia. Chinese information controls spread more easily to countries along the Belt and Road Initiative; Russian controls to countries within the Commonwealth of Independent States. In arriving at these findings, this talk first defines the Russian and Chinese models of information control and then traces the diffusion of these controls to the 110 countries within the Chinese and Russian technological spheres: geographical areas and spheres of influence into which Russian and Chinese information control technology, techniques of handling information, and law have diffused.

Presentation slides

 

Now You [Don’t] See Me: How have the GDPR and changing public awareness of the UK surveillance state impacted OSINT investigations?

Anjuli Shere, CDT18

This project surveys 16 open-source intelligence (OSINT) gathering and analysis practitioners across the public and private sectors to determine, firstly, what impact, if any, the implementation of the GDPR has had on their ability to operate successfully as OSINT analysts and, secondly, whether they have noticed any subsequent changes in UK public perception around issues of the surveillance state and digital privacy. I argue that this initial survey shows the GDPR is merely a first step in establishing societal expectations and regulations around digital privacy. While some adaptations in OSINT practice have been reported, to date few substantive changes to digital OSINT methods or analysis have resulted, or seem poised to take effect, one year after the implementation of the GDPR in the UK.

Presentation slides

 

Research Posters Displayed: