Author Archives: acadia

15/09/2016 – Talk by Matus Nemec

Title: The Million-Key Question – Investigating the Origins of RSA Public Keys
Time: 10:45
Location: Meeting room
Type: Research Result
Speaker: Matus Nemec
Abstract: Can bits of an RSA public key leak information about design and implementation choices such as the prime generation algorithm? We analysed over 60 million freshly generated key pairs from 22 open- and closed-source libraries and from 16 different smartcards, revealing significant leakage. The bias introduced by different choices is sufficiently large to classify a probable library or smartcard with high accuracy based only on the values of public keys. Such a classification can be used to decrease the anonymity set of users of anonymous mailers or operators of linked Tor hidden services, to quickly detect keys from the same vulnerable library, or to verify a claim of use of secure hardware by a remote party.

Our broad inspection provides a sanity check and deep insight regarding which of the recommendations for RSA key pair generation are followed in practice, including closed-source libraries and smartcards. The classification of the key origins of more than 10 million RSA-based IPv4 TLS keys and 1.4 million PGP keys also provides an independent estimation of the libraries that are most commonly used to generate the keys found on the Internet.
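The classification idea can be illustrated in miniature. The sketch below is a toy stand-in for the actual technique: the real work uses a much richer feature set extracted from key values, while here we use two made-up features (the top nibble of the modulus and a small residue) and a naive-Bayes-style frequency table:

```python
from collections import defaultdict

def features(n):
    # Illustrative features only: the top 4 bits of the modulus and a small residue.
    return (n >> (n.bit_length() - 4), n % 3)

def train(labelled_keys):
    # Count how often each feature vector occurs per source library.
    counts = defaultdict(lambda: defaultdict(int))
    totals = defaultdict(int)
    for n, lib in labelled_keys:
        counts[lib][features(n)] += 1
        totals[lib] += 1
    return counts, totals

def classify(n, counts, totals):
    # Pick the library whose training keys most often showed this feature vector
    # (add-one smoothing avoids zero probabilities for unseen features).
    f = features(n)
    return max(totals, key=lambda lib: (counts[lib][f] + 1) / (totals[lib] + 1))
```

In this toy setting, keys generated by a "library" with a biased top nibble are immediately separable from keys of another; the paper's point is that real libraries exhibit analogous, measurable biases.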

Talk based on:

Title: EACirc – Advancing cryptanalytic methods through evolutionary computing
Time: 11:30
Location: Meeting room
Type: Research Result
Speaker: Matus Nemec
Abstract: Cryptographic algorithms (primitives and protocols) always have to undergo elaborate testing by skilled experts before their overall security can be asserted. History carries many examples of serious flaws in cryptographic algorithms that raise concerns about the design and strength of the primitives.

Some automation is possible in the first phases of cryptanalysis, e.g., by using randomness testing suites. These tools can be applied to check statistical properties of a cryptographic algorithm's output, looking for deviations from randomness. Such a defect signals a potential flaw in the algorithm design. Yet these testing suites are limited to predefined tests for certain statistical defects, so other defects go unnoticed.

We propose a more open approach based on an automatically generated distinguisher in the form of a multiple-output logic function (MOLF), designed by evolutionary algorithms to search for undesired statistical anomalies in the algorithm output. In contrast to fixed sets of tests, our approach builds a distinguisher automatically, on the fly, adapting to the evaluated algorithm's output. This opens up new possibilities for discovering potential weaknesses that remain hidden from statistical tests and even from a cryptanalyst's sight.
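The evolutionary search can be sketched in miniature. Below, the "distinguisher" is just an 8-bit mask, a deliberately simplified stand-in for the MOLF circuits EACirc actually evolves, and a (1+1)-style loop mutates it to maximise the statistical gap between two byte streams:

```python
import random

def parity(x):
    return bin(x).count("1") % 2

def fitness(mask, stream_a, stream_b):
    # How differently the masked-parity statistic behaves on the two streams;
    # a good distinguisher maximises this gap.
    avg = lambda s: sum(parity(b & mask) for b in s) / len(s)
    return abs(avg(stream_a) - avg(stream_b))

def evolve(stream_a, stream_b, generations=200, seed=0):
    # (1+1) evolutionary strategy: mutate one bit of the mask, keep improvements.
    rng = random.Random(seed)
    best = rng.randrange(256)
    for _ in range(generations):
        cand = best ^ (1 << rng.randrange(8))
        if fitness(cand, stream_a, stream_b) >= fitness(best, stream_a, stream_b):
            best = cand
    return best
```

If one stream comes from a biased generator (say, its low bit is always zero) and the other is uniform, the evolved mask exposes the bias without any test for it being predefined, which is the core idea of the approach.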

15/09/2016 – Talk by Francesco Palmarini

Title: APDU-level attacks in PKCS#11 devices
Time: 10:00
Location: Meeting room
Type: Research Result
Speaker: Francesco Palmarini
Abstract: We describe attacks on PKCS#11 devices that we successfully mounted by interacting with the low-level APDU protocol, used to communicate with the device. They exploit proprietary implementation weaknesses which allow attackers to bypass the security enforced at the PKCS#11 level. Some of the attacks leak, as cleartext, sensitive cryptographic keys in devices that were previously considered secure. We present a new threat model for the PKCS#11 middleware and we discuss the new attacks with respect to various attackers and application configurations. All the attacks presented in this paper have been timely reported to manufacturers following a responsible disclosure process.

Note: The work will be presented at the 19th International Symposium on Research in Attacks, Intrusions and Defenses (RAID 2016) next week.

14/09/2016 – Talk by prof. Jia Yuan Yu (Concordia Institute of Information System Engineering)

Title: Central-Limit Approach to Risk-aware Markov Decision Processes
Time: 14:00
Location: Meeting room
Type: Research Result
Speaker: Jia Yuan Yu
Abstract: Whereas classical Markov decision processes maximize the expected reward, we consider minimizing the risk. We propose to evaluate the risk associated to a given policy over a long-enough time horizon with the help of a central limit theorem. The proposed approach works whether the transition probabilities are known or not. We also provide a gradient-based policy improvement algorithm that converges to a local optimum of the risk objective.
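The central-limit idea can be sketched numerically. This is an illustration of the approximation, not the talk's exact formulation: under a fixed policy with stationary per-step reward mean `mu` and standard deviation `sigma` (names ours), the T-step cumulative reward is approximately Normal(T·mu, T·sigma²), so the probability of falling below a threshold follows from the normal CDF:

```python
import math

def normal_cdf(x):
    # Standard normal CDF via the error function.
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def risk_estimate(mu, sigma, horizon, threshold):
    # CLT approximation: cumulative reward over `horizon` steps is roughly
    # Normal(horizon * mu, horizon * sigma**2), so the risk of ending below
    # `threshold` is Phi((threshold - horizon*mu) / (sigma * sqrt(horizon))).
    mean = horizon * mu
    std = sigma * math.sqrt(horizon)
    return normal_cdf((threshold - mean) / std)
```

For example, with mu = 1 and sigma = 1 over 100 steps, the risk of finishing below the expected total of 100 is exactly one half, and it shrinks as the threshold is lowered.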

07/09/2016 – Talk by Mohsin Jafri

Title:  Simulating Depth-Based Routing in Underwater Networks
Time: 13:45
Location: Meeting room
Type: Research Result
Speaker: Mohsin Jafri
Abstract: In this talk, I will briefly discuss my ongoing research activities. I will present the implementation of a simulator for studying Depth-Based Routing (DBR) in Underwater Wireless Sensor Networks (UWSNs). One of the major challenges for DBR is the configuration of a parameter called “holding time”, which influences the network throughput, response time and energy consumption. I will show how the simulator can be used to support the validation of a given choice of holding time, as well as of other network parameters. The presentation will also discuss the important features that distinguish the implemented simulator from the others available in the scientific literature, and its validation against previous simulation models and measurements.
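To give a flavour of the parameter in question: in DBR, a node that receives a packet holds it for a time that decreases with its depth advantage over the sender, so that the shallowest candidate forwards first and the others can suppress their copies. A common formulation from the original DBR proposal is f(d) = (2τ/δ)(R − d); the parameter names in the sketch below are ours:

```python
def holding_time(depth_sender, depth_receiver, max_range, delta, tau):
    # d: depth progress toward the surface made by this hop (positive = closer).
    # tau: maximum one-hop propagation delay; max_range: transmission range R;
    # delta: tunable parameter trading energy/collisions against delay.
    d = depth_sender - depth_receiver
    return (2 * tau / delta) * (max_range - d)
```

A node making maximal progress (d = R) forwards immediately, while nodes barely shallower than the sender wait longest; tuning `delta` is exactly the configuration problem the simulator helps validate.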

07/09/2016 – Talk by Francesco di Giacomo

Title:  Building Domain Specific Languages with the Metacasanova meta-compiler
Time: 14:00
Location: Meeting room
Type: Research Result
Speaker: Francesco di Giacomo
Abstract: Domain-specific languages (DSLs) are becoming more popular thanks to their ability to provide abstractions, not available in general-purpose languages, that ease the development process in specific scenarios, ranging from video games to web development.
Creating a new DSL requires building a compiler for it, which is a very complex piece of software. This complexity usually causes two problems: (i) the development of the compiler itself requires a significant amount of manpower, and (ii) adding new features to the language may become problematic. In this talk we propose an alternative to writing a hard-coded compiler: using a meta-compiler, i.e., software that takes as input the language definition and a program written in that language, and outputs corresponding executable code. We show the advantages of this approach in terms of simplicity of design and coding requirements, and we present a performance analysis of the generated code for Casanova 2, a DSL for game software development.

27/07/2016 – Talk by Enrico Steffinlongo

Title:  Efficient Static Detection of Collusion Attacks in ARBAC-based Workflow Systems
Time: 14:45
Location: Acadia Lab
Type: Research Result
Speaker: Enrico Steffinlongo
Abstract: Authorization in workflow systems is usually built on top of role-based access control (RBAC); security policies on workflows are then expressed as constraints on the users performing a set of tasks and the roles assigned to them. Unfortunately, when role administration is distributed and potentially untrusted users contribute to the role assignment process, as in the case of Administrative RBAC (ARBAC), collusions may take place to circumvent the intended workflow security policies. In a collusion attack, a set of users of a workflow system collaborates by changing the user-to-role assignment, so as to sidestep the security policies and run to completion a workflow they could not complete otherwise.
In this paper, we study the problem of collusion attacks in a formal model of workflows based on stable event structures and we define a precise notion of security against collusion. We then propose a static analysis technique based on a reduction to a role reachability problem for ARBAC, which can be used to prove or disprove security for a large class of workflow systems. We also discuss how to aggressively optimise the obtained role reachability problem to ensure its tractability. Finally, we implement our analysis in a tool, WARBAC, and we experimentally show its effectiveness on a set of publicly available examples, including a realistic case study.

27/07/2016 – Talk by Marco Squarcina

Title:  Report on ongoing PhD program
Time: 14:00
Location: Acadia Lab
Type: Research Result
Speaker: Marco Squarcina

Abstract: The aim of this talk is to briefly report on my ongoing research activities. After presenting the accepted and in-progress papers, I will focus on the results achieved during my internship at Cryptosense in Paris, where I carried out an analysis of several Java keystores (storage facilities for cryptographic keys and certificates), exposing design and implementation weaknesses. I am currently investigating the cracking resistance of the keystores against brute-force attacks, and I plan to support my findings by implementing password cracking plugins for popular tools such as jtr or hashcat.

27/04/2016 – Tutorial day

Tutorial day

The ACADIA research centre organises a tutorial day on hot topics in computer science.
Date: 27/04/2016
Location: Conference room, building Alpha, Scientific campus


– 11:00-13:00  Dale Miller (Ecole Polytechnique, Paris, France): “Communicating and trusting formal proofs”

– 14:00-17:00 Young Im Cho (Gachon Univ., Seoul, Korea): “Intelligent IoT Platforms for Smart City”

Abstract 1 (Communicating and trusting formal proofs)

In the midst of feeling insecure about our electronic communications, we can take solace in the advice that we can “trust the math” behind cryptography. Faced with concerns over the safety of fly-by-wire planes and self-driving cars, we should be able to find solace in the fact that some parts of our safety-critical systems have been proved formally correct: that is, we should be able to “trust the proof”. The way formal proofs are built today, however, makes trusting them difficult. While proof assistants are used to build formal proofs, those proofs are often locked to that assistant. Formal proofs produced by one proof assistant cannot usually be checked and trusted by another proof assistant, nor by a future version of itself. Thus, one of the components of the scientific method that ensures trust, namely reproducibility, is missing from proof checking.
The field of proof theory, as initiated by Gentzen in the 1930s, has identified a great deal of structure in formal proofs. Given recent developments in this topic, it is now possible to apply what we know about proofs in theory to what we would like to have for proofs in practice. To that end, I will present the Foundational Proof Certificates framework for defining the semantics of proof evidence.
Since such definitions are executable, it is possible to build proof checkers that can check a wide range of formal proofs. In this scheme, the logic engine that executes such definitions is the only thing that needs to be trusted. Since such a logic engine is based on well-known computational logic topics, anyone can write a logic engine in their choice of programming language and hardware in order to build a checker they can trust. I will also overview some of the consequences of defining proofs in this manner: provers can become networked far more richly than they are currently, and both libraries and marketplaces of proofs are enabled in a technology-independent manner.


Abstract 2 (Intelligent IoT Platforms for Smart City)

Machine-to-Machine communication (M2M) is a phenomenon that has been proceeding quietly in the background, and it is coming into a phase where an explosion of usage scenarios in businesses will happen. Sensors, actuators, RFID/NFC tags, vehicles, and intelligent machines all have the ability to communicate. The number of M2M connections is continuously increasing, and it has been predicted that billions of machines will be interconnected in the near future. M2M applications provide advantages in various domains, from building, energy, healthcare, industrial, transportation and retail to security and environmental services. This fast-growing ecosystem is leading M2M towards a promising future; however, M2M market expansion opportunities are not straightforward.
M2M is suffering from a highly fragmented, vertical, domain-specific approach, which has increased the R&D cost in each specific domain. In fact, various vertical M2M solutions have been designed independently and separately for different applications, which inevitably impacts or even impedes large-scale M2M deployment. The existence of multiple manufacturers, the lack of a common M2M Service Capability Layer and no clarity about what can be achieved have all combined to leave the field of M2M communications closer to dream than reality.
To reduce the standardization gap which exists between M2M domains, the ETSI Technical Committee M2M defined an end-to-end M2M service platform with an intermediate service layer, a key component of the horizontal M2M solution. This standards-based platform follows a RESTful approach with open interfaces to enable developing services and applications independently of the underlying network, thus easing the deployment of vertical applications for effective interoperability, and facilitating innovation across industries.
The following topics will be covered by the tutorial:
– IoT basics
– Smart City basics
– IoT international platform focusing on OneM2M reference model
– Components and model description for IoT platform
– Examples in detail
– Case studies and demonstration etc.

6 open fully funded PhD positions in Computer Science. Deadline: April, 21st

Ca’ Foscari University, Venice, announces 6 open fully funded PhD positions in Computer Science, one of which is on the following specific subject:

– “Data Science”, funded by ZHAW (Zurich University of Applied Sciences) – School of Engineering (Switzerland). More information at

PhD students will have the opportunity to work in one of three well-established research centers:

– KIIS (
– DHV (Digital Humanities Venice), in collaboration with EPF de Lausanne and Telecom Italia

The PhD students will be based in the modern scientific campus of Mestre (, reachable in only 10 minutes from Venice by public transportation. The campus is highly international with both Master and PhD courses completely held in English.

For more information please contact the program coordinator Prof. Riccardo Focardi (

The application deadline is April 21st, 2016, 1 PM (Italian time); applications can be submitted on-line at:

– English:
– Italian:

See more info at this page

23/03/2016 – Talk by Renzo Derosas

Title:  Modelling a Demographic Suicide: the Venetian Aristocracy, 1500 to 1800.
Time: 14:30
Location: Meeting room, building Zeta
Type: Research Result
Speaker: Renzo Derosas
BACKGROUND: Around 1500, the Republic of Venice was a leading European power, ruling over an empire that stretched from the current Swiss and Austrian borders on the north, all along the Adriatic coast, to the islands of Cyprus and Crete in the southeast. Although its political regime was celebrated as a mix of democratic, aristocratic, and monarchical features, political power was the monopoly of a limited number of aristocratic families who passed it on from generation to generation. Indeed, Venice was defined “a republic of families”, whose destinies were fully identified with the State they belonged to. On the one hand, the very survival of the Republic depended on the reproduction – demographic, economic, and cultural – of its ruling class; on the other hand, there was hardly a place for the Venetian nobility without the State that supported its privileged status.
The number of aristocratic families grew from ca. 600 in 1400 to a maximum of 900 in 1500, and then declined steadily to a low of 350 at the fall of the Republic (1797). Correspondingly, the noblemen grew from 900 in 1400 to 2,650 in 1500, falling to 1,100 in 1797. Although contemporaries were aware of the problems deriving from such a trend, all efforts to invert it proved ineffective. Historians agree that the decline originated in the peculiar marriage policy adopted by the aristocracy: once Venetian nobles converted from merchants into landowners, they allowed only one male per generation to marry. They were obsessed with keeping their assets undivided, since any branching out would jeopardize their social and political standing. Unfortunately, the outcome was disastrous from both the demographic and the economic standpoints. At the fall of the Republic, due to the hazards of reproduction, almost two thirds of the families had died out, while most of the survivors were heavily impoverished.

OBJECTIVES: With this research project, we challenge the current interpretation of the demographic decline of the Venetian aristocracy. We argue that it is incomplete, inasmuch as it does not consider that many noblemen remained single rather than marry, condemning their families to extinction. Such behavior was deeply contrary to the ethics and the ideology of the aristocracy. How can we explain it? Our aim is to develop a theory that accounts for both the decision to marry and the decision not to marry.

METHODS: Our starting point is the assumption that marital choices in the Venetian aristocracy were primarily a political option, undertaken to enhance a family’s position in the political arena through the acquisition of new kin. More specifically, we hypothesize that, when it came to establishing a new and lasting alliance through the marriage of their members, families strove to maximize the relational capital they would acquire. Our related hypothesis is that when the advantage deriving from all potential matches was below a certain threshold, a family would rather abstain from joining the market.
Although the strategic importance of marital choices for the Venetian aristocracy is widely acknowledged in the literature, the evidence provided thus far has been only anecdotal and rather loosely defined. To test our hypothesis, we proceed in a tentative and exploratory way. Since we have only a list of the marriages celebrated over time, but not of the population at risk, we define a set of potential partners made up of all those who married in the same year. We then outline the potential ego-networks formed by each match, and compute a series of centrality measures associated with each ego-network. Finally, we run a logistic regression to test whether any of these measures affects the likelihood of a match.
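The final step, a logistic regression relating a match's network measures to its likelihood, can be sketched with a minimal single-feature fit (plain gradient descent; the feature, e.g. the centrality gain a candidate match would bring, and all names are illustrative):

```python
import math

def fit_logistic(xs, ys, lr=0.1, epochs=2000):
    # Gradient-descent logistic regression: P(match) = sigmoid(a*x + b),
    # where x is a single network measure for a candidate match and
    # y is 1 if the match was actually celebrated, 0 otherwise.
    a, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            p = 1 / (1 + math.exp(-(a * x + b)))
            a += lr * (y - p) * x
            b += lr * (y - p)
    return a, b

def predict(a, b, x):
    # Estimated probability of a match given its network measure x.
    return 1 / (1 + math.exp(-(a * x + b)))
```

A positive fitted coefficient `a` would support the hypothesis: matches bringing a larger centrality gain are more likely to be realized. In practice one would use all centrality measures jointly (a multivariate fit) rather than a single feature.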