Kernelization

Theory of Parameterized Preprocessing
Fedor V. Fomin, Universitetet i Bergen, Norway
Daniel Lokshtanov, Universitetet i Bergen, Norway
Saket Saurabh, Institute of Mathematical Sciences, India, and Universitetet i Bergen, Norway
Meirav Zehavi, Ben-Gurion University of the Negev, Israel
June 2019
Hardback
9781107057760

Looking for an inspection copy?

This title is not currently available for inspection. However, if you are interested in the title for your course we can consider offering an inspection copy. To register your interest please contact [email protected] providing details of the course you are teaching.

$82.00 USD (Hardback); also available as an eBook

    Preprocessing, or data reduction, is a standard technique for simplifying and speeding up computation. Written by a team of experts in the field, this book introduces a rapidly developing area of preprocessing analysis known as kernelization. The authors provide an overview of basic methods and important results, with accessible explanations of the most recent advances in the area, such as meta-kernelization, representative sets, polynomial lower bounds, and lossy kernelization. The text is divided into four parts, which cover the different theoretical aspects of the area: upper bounds, meta-theorems, lower bounds, and beyond kernelization. The methods are demonstrated through extensive examples using a single data set. Written to be self-contained, the book only requires a basic background in algorithmics and will be of use to professionals, researchers and graduate students in theoretical computer science, optimization, combinatorics, and related fields.

    • Revisits the same data set to demonstrate the appropriate uses for different methods
    • Features extended examples to help students build practical intuition and understand the motivation behind the theory
    • Surveys all four main aspects of kernelization and the relations between them
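
    As a concrete taste of the "preprocessing with a guarantee" that the description above refers to, the following Python sketch implements the classical Buss reduction rules for Vertex Cover parameterized by the solution size k. It is an illustrative example only, not code or notation taken from the book, and the function name and interface are assumptions made for this sketch: any vertex of degree greater than k must belong to every vertex cover of size at most k, and once no such vertex remains, an instance with more than k² edges can be rejected outright, so the surviving instance (the kernel) has at most k² edges.

        def vertex_cover_kernel(edges, k):
            """Buss-style kernelization for Vertex Cover (illustrative sketch).

            Takes an edge list of a simple undirected graph and a parameter k.
            Returns None if no vertex cover of size at most k can exist, or a
            triple (reduced_edges, reduced_k, forced) where `forced` contains
            vertices that must belong to every vertex cover of size at most k.
            """
            edges = {frozenset(e) for e in edges}
            forced = set()
            while True:
                # Compute vertex degrees in the current reduced graph.
                deg = {}
                for e in edges:
                    for v in e:
                        deg[v] = deg.get(v, 0) + 1
                # Rule 1: a vertex of degree > k is in every cover of size <= k,
                # since otherwise all of its more-than-k neighbours would be.
                high = next((v for v, d in deg.items() if d > k), None)
                if high is None:
                    break
                forced.add(high)
                edges = {e for e in edges if high not in e}
                k -= 1
                if k < 0:
                    return None
            # Rule 2: all remaining degrees are <= k, so k vertices can cover
            # at most k * k edges; more edges than that means a no-instance.
            if len(edges) > k * k:
                return None
            return edges, k, forced

    For example, vertex_cover_kernel([(1, 2), (1, 3), (1, 4)], 1) forces vertex 1 into the cover and returns an empty residual instance.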

    Reviews & endorsements

    'Kernelization is one of the most important and most practical techniques coming from parameterized complexity. In parameterized complexity, kernelization is the technique of data reduction with a performance guarantee. From humble beginnings in the 1990's it has now blossomed into a deep and broad subject with important applications, and a well-developed theory. Time is right for a monograph on this subject. The authors are some of the leading lights in this area. This is an excellent and well-designed monograph, fully suitable for both graduate students and practitioners to bring them to the state of the art. The authors are to be congratulated for this fine book.' Rod Downey, Victoria University of Wellington

    'Kernelization is an important technique in parameterized complexity theory, supplying in many cases efficient algorithms for preprocessing an input to a problem and transforming it to a smaller one. The book provides a comprehensive treatment of this active area, starting with the basic methods and covering the most recent developments. This is a beautiful manuscript written by four leading researchers in the area.' Noga Alon, Princeton University, New Jersey and Tel Aviv University

    'This book will be of great interest to computer science students and researchers concerned with practical combinatorial optimization, offering the first comprehensive survey of the rapidly developing mathematical theory of pre-processing - a nearly universal algorithmic strategy when dealing with real-world datasets. Concrete open problems in the subject are nicely highlighted.' Michael Fellows, Universitetet i Bergen, Norway

    'The study of kernelization is a relatively recent development in algorithm research. With mathematical rigor and giving the intuition behind the ideas, this book is an excellent and comprehensive introduction to this new field. It covers the entire spectrum of topics, from basic and advanced algorithmic techniques to lower bounds, and goes beyond these with meta-theorems and variations on the notion of kernelization. The book is suitable for students wanting to learn the field as well as experts, who would both benefit from the full coverage of topics.' Hans L. Bodlaender, Universiteit Utrecht

    'The book is well written and provides a wealth of examples to illustrate concepts, while being succinct.' D. Papamichail, Choice

    'The book does a good job in several ways: it can serve as the first textbook on this flourishing area of research; it is also very useful for self-study, as it contains quite a number of exercises, with further pointers to the literature. In addition, it gives quite a good overview of the present state-of-the-art and can therefore help researchers in the area to discover results that (s)he might have missed due to the speed in which the area has developed over the last decade.' Henning Fernau, MathSciNet

    'This book studies the research area of kernelization, which consists of the techniques used for data reduction via pre-processing in order to speed up data analysis computations … the book explores very novel and complex ideas, it is well written with attention to detail and easy to follow. The book concludes with a useful list of relevant references.' Efstratios Rappos, zbMATH

    'The book manages to present an incredible number of techniques, methods, and examples in its 528 pages. Each chapter ends with a bibliographic notes section, which often provides some small historical context for the material covered. It also points to more current results and papers although it does so very briefly. Together, this makes the textbook a valuable resource book to researchers.' Tim Jackman and Steve Homer, SIGACT News

    Product details

    June 2019
    Hardback
    9781107057760
    528 pages
    235 × 157 × 31 mm
    0.88 kg
    Available

    Table of Contents

    • 1. What is a kernel?
    • Part I. Upper Bounds:
    • 2. Warm up
    • 3. Inductive priorities
    • 4. Crown decomposition
    • 5. Expansion lemma
    • 6. Linear programming
    • 7. Hypertrees
    • 8. Sunflower lemma
    • 9. Modules
    • 10. Matroids
    • 11. Representative families
    • 12. Greedy packing
    • 13. Euler's formula
    • Part II. Meta Theorems:
    • 14. Introduction to treewidth
    • 15. Bidimensionality and protrusions
    • 16. Surgery on graphs
    • Part III. Lower Bounds:
    • 17. Framework
    • 18. Instance selectors
    • 19. Polynomial parameter transformation
    • 20. Polynomial lower bounds
    • 21. Extending distillation
    • Part IV. Beyond Kernelization:
    • 22. Turing kernelization
    • 23. Lossy kernelization.

    Authors

    • Fedor V. Fomin, Universitetet i Bergen, Norway

      Fedor V. Fomin is Professor of Computer Science at the Universitetet i Bergen, Norway. He is known for his work in algorithms and graph theory. He has co-authored two books, Exact Exponential Algorithms (2010) and Parameterized Algorithms (2015), and received the EATCS Nerode Prize in 2015 and 2017 for his work on bidimensionality and Measure and Conquer.

    • Daniel Lokshtanov, Universitetet i Bergen, Norway

      Daniel Lokshtanov is Professor of Informatics at the Universitetet i Bergen, Norway. His main research interests are in graph algorithms, parameterized algorithms, and complexity. He is a co-author of Parameterized Algorithms (2015) and is a recipient of the Meltzer Prize, the Bergen Research Foundation young researcher grant, and an ERC Starting Grant on parameterized algorithms.

    • Saket Saurabh, Institute of Mathematical Sciences, India, and Universitetet i Bergen, Norway

      Saket Saurabh is Professor of Theoretical Computer Science at the Institute of Mathematical Sciences, Chennai, and Professor of Computer Science at the Universitetet i Bergen, Norway. He has made important contributions to every aspect of parameterized complexity and kernelization, especially to general-purpose results in kernelization and applications of extremal combinatorics in designing parameterized algorithms. He is a co-author of Parameterized Algorithms (2015).

    • Meirav Zehavi, Ben-Gurion University of the Negev, Israel

      Meirav Zehavi is Assistant Professor of Computer Science at Ben-Gurion University of the Negev. Her research interests lie primarily in the field of parameterized complexity. During her Ph.D. studies, she received three best student paper awards.