MSc in Computer Science - Specialization in Algorithmics
Aarhus, Denmark
DURATION
4 Semesters
LANGUAGES
English
PACE
Full time
APPLICATION DEADLINE
Request application deadline *
EARLIEST START DATE
Aug 2025
TUITION FEES
EUR 15,300 / per year **
STUDY FORMAT
On-Campus
* 15 January for non-EU citizens and 1 March for EU citizens
** for non-EU/EEA students only | EU/EEA students study for free
Introduction
Specializations in Computer Science
The master’s programme runs over two years with four semesters. Three semesters are dedicated to specializations or electives, and the last semester is for your master’s thesis, which some students choose to write in collaboration with a company. Once enrolled, our programme manager will help you plan your master’s programme based on your interests.
Specialization
- Two 30 ECTS specializations
Elective
- A third specialization is recommended.
- A small number of elective courses in computer science are offered in addition to the specializations; project work can also count in part.
- Elective courses may be supportive rather than core computer science, e.g. extra mathematics courses.
- Admission may come with requirements on the composition of the study programme; in that case, mandatory courses replace some of the elective courses.
Thesis
- Written within the area of specialization 1 or 2.
Algorithmics
1st Sem (Fall): Computational Geometry: Theory and Experimentation (10 ECTS)
2nd Sem (Spring): Randomized Algorithms (10 ECTS)
3rd Sem (Fall): Theory of Algorithms and Computational Complexity (10 ECTS) OR Quantum Information Processing (10 ECTS)
*Semesters are independent and can be taken in any order
*The third semester may be replaced with Advanced Data Management and Analysis (10 ECTS) from the Data-Intensive Systems group
Admissions
Curriculum
Computational Geometry: Theory and Experimentation
Course content
The main objective of the course is to give participants insight into implementing, testing, and analyzing geometric algorithms, both on small and large data sets. This is done in three parts.
In the first part, the course covers the memory hierarchies of modern computers and their influence on the running time of algorithms, introducing participants to important concepts such as caches, cache misses, pipelining, and branch mispredictions. The participants will also be introduced to guidelines for designing proper tests and procedures to experimentally evaluate their implemented algorithms.
In the second part, building on the first, the participants will be introduced to some of the major concepts in external-memory data structures and algorithms. The topics include sorting, B-trees and their variants, buffer trees, and distributed sweeping.
In the third part, building on the previous parts, the participants will be introduced to some of the fundamental data structures and algorithms in computational geometry, together with proofs of correctness and worst-case and expected-case analyses of geometric algorithms. The topics include convex hull algorithms, linear programming, arrangements, point-line duality, polygon triangulation, orthogonal data structure problems, point location, Voronoi diagrams, and Delaunay triangulations.
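To give a flavour of the kind of algorithm studied in this part, here is a minimal sketch (not course material, purely illustrative) of a classic convex hull algorithm, Andrew's monotone chain, which runs in O(n log n) time after sorting:

```python
def cross(o, a, b):
    """Cross product of vectors OA and OB; positive means a left turn."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def convex_hull(points):
    """Andrew's monotone chain: returns hull vertices in counter-clockwise order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    lower = []
    for p in pts:
        # pop points that would make a non-left turn
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    upper = []
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    # drop the last point of each half-hull (it repeats the other's start)
    return lower[:-1] + upper[:-1]

print(convex_hull([(0, 0), (2, 0), (2, 2), (0, 2), (1, 1)]))
# → [(0, 0), (2, 0), (2, 2), (0, 2)]; the interior point (1, 1) is discarded
```

The same algorithm is also a natural candidate for the experimental part of the course: its cache behaviour is dominated by the initial sort, so input size and memory layout visibly affect running time.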
Description of qualifications
Objectives of the course
After the course, the participants will be able to implement, test, and analyze geometric algorithms, both on small and large data sets.
Randomized Algorithms
Course content
Randomization is a crucial technique in modern big data analysis and often allows for both simpler and more efficient algorithms.
In this course, the participants will learn how to apply randomization to solve fundamental tasks in big data analysis, such as nearest-neighbour search, frequency estimation, dimensionality reduction, and hashing. The participants will learn how to perform a rigorous mathematical analysis of the performance of randomized algorithms, including analysing expected running times and bounding failure probabilities. To do so, they will use basic probability theory such as linearity of expectation, the union bound, and tail bounds such as Markov's and Chebyshev's inequalities and the Chernoff bound. The participants will also gain experience with implementing and testing randomized algorithms on real-life data sets.
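As a small illustration of hashing-based frequency estimation in this spirit, here is a toy count-min sketch (the class name and parameters are illustrative, not taken from the course). Each row overestimates a count by the collisions landing in its cell; Markov's inequality bounds the probability that this excess is large, and taking the minimum over independent rows drives the failure probability down:

```python
import random

class CountMinSketch:
    """Estimates item frequencies in a stream using hashing.
    Estimates never undercount; the expected overcount per row is
    (stream length) / width, so wider tables give tighter estimates."""
    def __init__(self, width, depth, seed=0):
        rng = random.Random(seed)
        self.width = width
        # one hash function per row, simulated by salting Python's hash()
        self.salts = [rng.getrandbits(64) for _ in range(depth)]
        self.table = [[0] * width for _ in range(depth)]

    def add(self, item):
        for row, salt in enumerate(self.salts):
            self.table[row][hash((salt, item)) % self.width] += 1

    def estimate(self, item):
        # every row overestimates, so the minimum is the tightest bound
        return min(self.table[row][hash((salt, item)) % self.width]
                   for row, salt in enumerate(self.salts))

cms = CountMinSketch(width=256, depth=4)
for _ in range(100):
    cms.add("frequent")
cms.add("rare")
print(cms.estimate("frequent"))  # at least 100, exact unless all rows collide
```

Note that a production sketch would use pairwise-independent hash functions rather than salted `hash()`; the salting here only mimics independent rows for illustration.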
Description of qualifications
Objectives of the course
After the course, the participants will be able to apply randomization as a technique for designing efficient algorithms and data structures for problems arising in modern data analysis. This includes implementing, testing, and giving a mathematical analysis of the efficiency of randomized algorithms based on, e.g., expected running times and failure probabilities.
Theory of Algorithms and Computational Complexity
Course content
This course presents fundamental results in theoretical computer science covering basic computational complexity theory and selected algorithmic results.
Computational complexity theory is the study of efficient computation and its fundamental limitations. The theory enables us to precisely state and answer questions of the form: How much time and space is required to solve concrete computational problems? Can randomness or massive parallelism make computation significantly faster? How well can we approximate hard optimization problems?
Algorithmic results covered in the course include fundamental algorithms such as fast matrix multiplication, the fast Fourier transform, and lattice basis reduction, as well as applications of these algorithms.
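As an illustration of one of these algorithms (a minimal sketch, not course material), the radix-2 Cooley-Tukey fast Fourier transform computes the discrete Fourier transform in O(n log n) time instead of the naive O(n²):

```python
import cmath

def fft(a):
    """Recursive radix-2 Cooley-Tukey FFT; len(a) must be a power of two."""
    n = len(a)
    if n == 1:
        return a[:]
    # split into even- and odd-indexed halves and recurse
    even = fft(a[0::2])
    odd = fft(a[1::2])
    out = [0j] * n
    for k in range(n // 2):
        # combine with the k-th twiddle factor e^{-2πik/n}
        twiddle = cmath.exp(-2j * cmath.pi * k / n) * odd[k]
        out[k] = even[k] + twiddle
        out[k + n // 2] = even[k] - twiddle
    return out

# the DFT of an impulse is the all-ones vector
print([round(x.real) for x in fft([1, 0, 0, 0])])  # → [1, 1, 1, 1]
# the DFT of a constant signal concentrates in the zero frequency
print([round(x.real) for x in fft([1, 1, 1, 1])])  # → [4, 0, 0, 0]
```

The divide-and-combine recurrence T(n) = 2T(n/2) + O(n) is exactly the pattern whose analysis the course trains.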
Description of qualifications
After the course, the participants will have insight into fundamental results and techniques from computational complexity theory and the theory of algorithms.
At the end of the course, the participants will be able to:
- Distinguish and explain key concepts in complexity theory and algorithmic techniques,
- Describe and analyze the efficiency of advanced algorithms,
- Prove results within complexity theory and about algorithms.
The working method of the course will train the participants to communicate technical material and to read and understand research papers.
Or
Quantum Information Processing
Course content
The idea of quantum computing arose in part from the difficulty of simulating quantum systems on a standard computer. In the 1980s, some physicists suggested using the quantum system itself as a computational device! The system could simulate itself, but what else could it do? This led to theoretical definitions of a quantum computer.
In the 1990s, many interesting algorithmic applications of quantum computers were found. In particular, Peter Shor found an efficient quantum algorithm for factoring large numbers (opening up the potential for quantum computers to break current-day cryptosystems) and Lov Grover found a quantum algorithm to speed up unstructured search. At the same time, applications of quantum computing that helped in cryptography were also discovered.
In recent years, these theoretical models of quantum computing devices have come closer to reality, as companies invest vast amounts of resources in building quantum computers.
This course will introduce basic concepts in quantum information and models of quantum computation, cover the quantum algorithms above, and show how quantum communication can be used to design unconditionally secure protocols for key exchange, something that is impossible with classical communication alone.
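The key-exchange idea can be illustrated with a purely classical simulation of a BB84-style protocol (a toy sketch with no eavesdropper modeled; all names are illustrative). Alice encodes random bits in random bases; Bob measures in random bases, and a measurement in the wrong basis yields a uniformly random bit. The rounds whose publicly compared bases disagree are discarded:

```python
import random

def bb84_sift(n, rng):
    """Classical simulation of BB84 sifting without an eavesdropper:
    where Alice's and Bob's bases match, Bob's bit equals Alice's."""
    alice_bits  = [rng.randint(0, 1) for _ in range(n)]
    alice_bases = [rng.randint(0, 1) for _ in range(n)]
    bob_bases   = [rng.randint(0, 1) for _ in range(n)]
    # measuring in the wrong basis yields a uniformly random bit
    bob_bits = [bit if ab == bb else rng.randint(0, 1)
                for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)]
    # sifting: keep only the rounds where the bases agree
    key_alice = [b for b, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
    key_bob   = [b for b, ab, bb in zip(bob_bits, alice_bases, bob_bases) if ab == bb]
    return key_alice, key_bob

rng = random.Random(42)
ka, kb = bb84_sift(32, rng)
print(ka == kb)  # True: without an eavesdropper the sifted keys always agree
```

What this classical sketch cannot capture is the quantum part: an eavesdropper who measures in a random basis disturbs the transmitted states and is detected when Alice and Bob compare a sample of their sifted key; the course develops that security argument.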
Description of qualifications
Objectives of the course
After the course, the participants will understand the basic concepts of quantum information, quantum algorithms, and applications to quantum cryptography.