Adversarial Attacks on Neural Networks

Schedule: Aug 9th, 12:00 - 12:45 PM | Place: Jupiter

Adversarial examples for Deep Neural Networks have come a long way since 2014. This talk aims to be a comprehensive introduction to adversarial attacks, covering the main threat models (black-box and white-box) and the main approaches to creating adversarial examples, with live demos. The talk will dive deep into the intuition behind why adversarial examples exhibit the properties they do, in particular their transferability across models and training data, and the high confidence of their incorrect labels. Finally, we will go over various approaches to mitigating these attacks (Adversarial Training, Defensive Distillation, Gradient Masking, etc.) and discuss what seems to have worked best over the past year.
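
To make the white-box setting concrete, here is a minimal sketch of the Fast Gradient Sign Method (FGSM) in PyTorch; the model, inputs and epsilon value are illustrative placeholders, not material from the talk itself.

    import torch
    import torch.nn.functional as F

    def fgsm_attack(model, x, label, epsilon=0.03):
        # Perturb x in the direction that most increases the loss,
        # bounded by epsilon per pixel (an L-infinity constraint).
        x = x.clone().detach().requires_grad_(True)
        loss = F.cross_entropy(model(x), label)
        loss.backward()
        x_adv = x + epsilon * x.grad.sign()
        return x_adv.clamp(0, 1).detach()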

 
 

Outline/Structure of the Talk

The presentation will follow this outline:

  • What are Adversarial attacks?
  • CIA Model of Security
  • Threat models
  • Examples and demos of Adversarial attacks
  • Proposed Defenses against adversarial attacks
  • Intuition behind Adversarial attacks
  • What’s next?

Learning Outcome

This talk is motivated by the question: Are adversarial examples simply a fun toy problem for researchers or an example of a deeper and more chronic frailty in our models? The learning outcome for attendees from this talk is to realize that Deep Learning Models are just another tool, susceptible to adversarial attacks. These can have huge implications, especially in a world with self-driving cars and other automation.

Target Audience

Deep Learning Practitioners or students interested in learning more about an up-and-coming area of research in this field.

Prerequisites for Attendees

A beginner-level understanding of how Deep Neural Networks work.

Submitted 3 months ago

Public Feedback

  • Deepti Tomar  ~  1 month ago

    Dear Anant,

    Thanks for your submission. Adversarial Attacks is certainly an important topic.

    You've mentioned mitigating the attacks using various approaches. Would you be sharing any specific example or use case that you've worked on related to this in your session?

    Thanks,

    Deepti

    • Anant Jain  ~  1 month ago

      Hi Deepti,

      Thanks for considering the proposal. The two broad mitigation approaches I was planning to cover were Adversarial Training (where you generate adversarial examples using multiple known approaches and include them in your training set) and Defensive Distillation (where we learn a secondary model to mimic what a primary model has learned). Both approaches have shown limited success, as shown in the "Obfuscated Gradients Give a False Sense of Security: Circumventing Defenses to Adversarial Examples" paper and other literature. I will mention the approaches used in the NIPS competition, along with real-world code/resources attendees can refer to in order to implement these defenses (see the sketch after this thread).

      Besides the toy/research-paper examples, I was planning to work a popular recent news story into the talk. Earlier this month, researchers at Tencent's Keen Security Labs tricked Tesla's Autopilot into swerving into the wrong lane using just physical stickers (article, paper). I'll dive deeper into the specifics of how this was done, and use it as the main example to demonstrate the impact adversarial attacks can have on real-world AI problems.

      Thanks,

      Anant
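
As an editorial aside, here is a minimal sketch of the adversarial-training idea described in the reply above: adversarial copies of each batch are generated with FGSM and trained on alongside the clean batch. The model, data loader and optimizer are placeholders, not the speaker's code.

    import torch
    import torch.nn.functional as F

    def fgsm(model, x, y, eps):
        x = x.clone().detach().requires_grad_(True)
        F.cross_entropy(model(x), y).backward()
        return (x + eps * x.grad.sign()).clamp(0, 1).detach()

    def train_epoch_adversarial(model, loader, optimizer, eps=0.03):
        model.train()
        for x, y in loader:
            x_adv = fgsm(model, x, y, eps)   # craft adversarial copies of the batch
            optimizer.zero_grad()            # discard gradients accumulated by the attack step
            loss = F.cross_entropy(model(x), y) + F.cross_entropy(model(x_adv), y)
            loss.backward()
            optimizer.step()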


  • Liked: Favio Vázquez - Complete Data Science Workflows with Open Source Tools

    90 Mins
    Tutorial
    Beginner

    Cleaning, preparing, transforming, exploring and modeling data is what we hear about all the time in data science, and these steps may be the most important ones. But that's not all there is to data science. In this talk you will learn how the combination of Apache Spark, Optimus, the Python ecosystem and Data Operations can form a complete framework for data science that will allow you and your company to go further, beyond common sense and intuition, to solve complex business problems.
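
    As a rough, generic illustration of the cleaning and transformation steps described above, the sketch below uses plain PySpark; it does not show Optimus or Data Operations, and the file path and column names are made up.

        from pyspark.sql import SparkSession
        from pyspark.sql import functions as F

        spark = SparkSession.builder.appName("workflow-sketch").getOrCreate()

        # Load, clean and aggregate a hypothetical sales dataset
        df = spark.read.csv("sales.csv", header=True, inferSchema=True)
        clean = (df.dropna(subset=["price"])
                   .withColumn("price", F.col("price").cast("double"))
                   .filter(F.col("price") > 0))
        clean.groupBy("category").agg(F.avg("price").alias("avg_price")).show()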

  • Liked: Dipanjan Sarkar - Explainable Artificial Intelligence - Demystifying the Hype

    Data Scientist, Red Hat
    45 Mins
    Tutorial
    Intermediate

    The field of Artificial Intelligence, powered by Machine Learning and Deep Learning, has gone through some phenomenal changes over the last decade. It started off as a purely academic and research-oriented domain, but we have since seen widespread industry adoption across diverse domains including retail, technology, healthcare, science and many more. More often than not, the standard toolbox of machine learning, statistical or deep learning models remains the same. New models do come into existence, like Capsule Networks, but industry adoption of them usually takes several years. Hence, in the industry, the main focus of data science or machine learning is more 'applied' than theoretical, and effective application of these models on the right data to solve complex real-world problems is of paramount importance.

    A machine learning or deep learning model by itself consists of an algorithm which tries to learn latent patterns and relationships from data without hard-coding fixed rules. Hence, explaining how a model works to the business always poses its own set of challenges. There are some domains in the industry, especially in the world of finance such as insurance or banking, where data scientists often end up having to use more traditional machine learning models (linear or tree-based). The reason is that model interpretability is very important for the business, which wants to explain each and every decision taken by the model. However, this often leads to a sacrifice in performance. This is where complex models like ensembles and neural networks typically give us better and more accurate performance (since true relationships are rarely linear in nature). We, however, end up being unable to provide proper interpretations for model decisions.

    To address these gaps, I will take a conceptual yet hands-on approach, exploring some of these challenges of explainable artificial intelligence (XAI) and human-interpretable machine learning in depth, and showcasing examples using state-of-the-art model interpretation frameworks in Python!
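
    As one possible illustration of the kind of Python model-interpretation framework mentioned above (SHAP here; LIME or ELI5 would work similarly), consider this sketch; the dataset and model are placeholders, not examples from the session.

        import shap
        import xgboost
        from sklearn.datasets import load_breast_cancer

        X, y = load_breast_cancer(return_X_y=True, as_frame=True)
        model = xgboost.XGBClassifier().fit(X, y)

        explainer = shap.TreeExplainer(model)    # explainer specialised for tree ensembles
        shap_values = explainer.shap_values(X)   # per-feature contribution for each prediction
        shap.summary_plot(shap_values, X)        # global view of which features drive the model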

  • Liked: Dat Tran - Image ATM - Image Classification for Everyone

    Head of AI, Axel Springer AI
    45 Mins
    Talk
    Intermediate

    At idealo.de we store and display millions of images. Our gallery contains pictures of all sorts: you'll find vacuum cleaners and bike helmets there, as well as hotel rooms. Working with a huge volume of images brings some challenges: How to organize the galleries? What exactly is in there? Do we actually need all of it?

    To tackle these problems you first need to label all the pictures. In 2018 our Data Science team completed four projects in the area of image classification, and in 2019 there are many more to come. Therefore, we decided to automate this process by creating a piece of software we call Image ATM (Automated Tagging Machine). With the help of transfer learning, Image ATM enables the user to train a Deep Learning model without knowledge or experience in the area of Machine Learning. All you need is data and a couple of spare minutes!

    In this talk we will discuss the state-of-the-art technologies available for image classification and present Image ATM in the context of these technologies. We will then give a crash course on our product, guiding you through the different ways of using it: in the shell, in a Jupyter Notebook and on the cloud. We will also talk about our roadmap for Image ATM.
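
    To make the transfer-learning idea concrete, here is a generic Keras sketch of the kind of workflow Image ATM automates; this is not the Image ATM API, and the directory layout and class count are made up.

        import tensorflow as tf

        base = tf.keras.applications.MobileNetV2(include_top=False, pooling="avg",
                                                 input_shape=(224, 224, 3))
        base.trainable = False                              # reuse pretrained ImageNet features

        model = tf.keras.Sequential([
            base,
            tf.keras.layers.Dense(4, activation="softmax")  # e.g. four product categories
        ])
        model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])

        train = tf.keras.preprocessing.image_dataset_from_directory(
            "images/", image_size=(224, 224), label_mode="categorical")
        model.fit(train, epochs=3)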

  • Liked: Rahee Walambe / Vishal Gokhale - Processing Sequential Data using RNNs

    480 Mins
    Workshop
    Beginner

    Data that forms the basis of many of our daily activities, like speech, text and videos, has sequential/temporal dependencies. Traditional deep learning models, being inadequate for modeling this connectivity, needed to be made recurrent before they could bring technologies such as voice assistants (Alexa, Siri) or video-based speech translation (Google Translate) to a practically usable form by significantly reducing the Word Error Rate (WER). RNNs solve this problem by adding internal memory. The capacities of traditional neural networks are bolstered by this addition, and the results outperform conventional ML techniques wherever temporal dynamics are important.
    In this full-day immersive workshop, participants will develop an intuition for sequence models through hands-on learning along with the mathematical premise of RNNs.
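
    As a minimal sketch of the kind of recurrent model the workshop builds up to, here is a small LSTM sequence classifier in PyTorch; the vocabulary size, sequence length and number of classes are illustrative.

        import torch
        import torch.nn as nn

        class SequenceClassifier(nn.Module):
            def __init__(self, vocab=5000, emb=64, hidden=128, classes=2):
                super().__init__()
                self.embed = nn.Embedding(vocab, emb)
                self.lstm = nn.LSTM(emb, hidden, batch_first=True)  # internal memory across time steps
                self.head = nn.Linear(hidden, classes)

            def forward(self, token_ids):            # token_ids: (batch, seq_len)
                x = self.embed(token_ids)
                _, (h_n, _) = self.lstm(x)           # last hidden state summarises the sequence
                return self.head(h_n[-1])

        logits = SequenceClassifier()(torch.randint(0, 5000, (8, 20)))  # 8 sequences of 20 tokens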

  • 45 Mins
    Demonstration
    Intermediate

    Artificial Intelligence (AI) has been rapidly adopted in various spheres of medicine such as microbiological analysis, drug discovery, disease diagnosis, genomics, medical imaging and bioinformatics for translating biomedical data into improved human healthcare. Automation in healthcare using machine learning/deep learning assists physicians in making faster, cheaper and more accurate diagnoses.

    We have completed three healthcare projects using deep learning and are currently working on three more healthcare projects. In this session, we shall demonstrate two deep learning based healthcare applications developed using TensorFlow. The discussion of each application will include the following: problem statement, proposed solution, data collected, experimental analysis and challenges faced to achieve this success. Finally, we will briefly discuss the other applications on which we are currently working and the future scope of research in this area.

  • 90 Mins
    Tutorial
    Intermediate

    Machine learning and deep learning have been rapidly adopted in providing solutions to various problems in medicine. If you wish to build scalable machine learning/deep learning-powered healthcare solutions, you need to understand how to use tools to build them.

    TensorFlow is an open-source machine learning framework. It enables the use of data flow graphs for numerical computations, with automatic parallelization across several CPUs, GPUs or TPUs. Its architecture makes it ideal for implementing machine learning/deep learning algorithms.

    This tutorial will provide hands-on exposure to implement Deep Learning based healthcare solutions using TensorFlow.
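
    As a minimal sketch of the kind of TensorFlow model such a tutorial starts from, here is a tiny Keras classifier trained on synthetic "patient features"; the data and architecture are placeholders, not material from the session.

        import numpy as np
        import tensorflow as tf

        X = np.random.rand(500, 10).astype("float32")   # 500 records, 10 synthetic features
        y = np.random.randint(0, 2, size=(500,))        # binary diagnosis label

        model = tf.keras.Sequential([
            tf.keras.layers.Dense(32, activation="relu", input_shape=(10,)),
            tf.keras.layers.Dense(1, activation="sigmoid"),
        ])
        model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
        model.fit(X, y, epochs=5, batch_size=32)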

  • Liked: Maryam Jahanshahi - Applying Dynamic Embeddings in Natural Language Processing to Analyze Text over Time

    Research Scientist, TapRecruit
    45 Mins
    Case Study
    Intermediate

    Many data scientists are familiar with word embedding models such as word2vec, which capture semantic similarity of words in a large corpus. However, word embeddings are limited in their ability to interrogate a corpus alongside other context or over time. Moreover, word embedding models either need significant amounts of data or need tuning through transfer learning to handle the domain-specific vocabulary that is unique to most commercial applications.

    In this talk, I will introduce exponential family embeddings. Developed by Rudolph and Blei, these methods extend the idea of word embeddings to other types of high-dimensional data. I will demonstrate how they can be used to conduct advanced topic modeling on medium-sized datasets that are specialized enough to require significant modifications of a word2vec model and that contain more general data types (including categorical, count and continuous data). I will discuss how my team implemented a dynamic embedding model using TensorFlow and our proprietary corpus of job descriptions. Using both categorical and natural-language data associated with jobs, we charted the development of different skill sets over the last 3 years. I will specifically focus the description of results on how tech and data science skill sets have developed, grown and pollinated other types of jobs over time.
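
    The speaker's dynamic embedding model is implemented with exponential family embeddings in TensorFlow; as a much simpler stand-in for the underlying idea, the sketch below trains a separate word2vec model per year (with gensim) and compares how a term's neighbours drift. The toy sentences are made up.

        from gensim.models import Word2Vec

        corpus_by_year = {
            2016: [["data", "scientist", "statistics"], ["python", "analysis"]],
            2018: [["data", "scientist", "deep", "learning"], ["python", "tensorflow"]],
        }  # toy tokenised sentences; a real corpus would be job descriptions

        for year, sentences in corpus_by_year.items():
            model = Word2Vec(sentences, vector_size=50, min_count=1, epochs=50)
            print(year, model.wv.most_similar("data", topn=3))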

  • Liked: Saurabh Jha / Usha Rengaraju - Hands on Deep Learning for Computer Vision

    480 Mins
    Workshop
    Intermediate

    Computer Vision has lots of applications, including medical imaging, autonomous vehicles, industrial inspection and augmented reality. The use of Deep Learning for Computer Vision can be grouped into multiple categories for both images and videos: classification, detection, segmentation and generation.

    Having worked in Deep Learning with a focus on Computer Vision, we have come across various challenges and learned best practices over a period of experimenting with cutting-edge ideas. This workshop is for Data Scientists and Computer Vision Engineers whose focus is deep learning. We will cover state-of-the-art architectures for image classification and segmentation, and practical tips and tricks to train deep neural network models. It will be a hands-on session where every concept is introduced through Python code, and our choice of deep learning framework will be PyTorch v1.0.

    The workshop takes a structured approach. First it covers basic techniques in image processing and Python for handling images and building PyTorch data loaders. We then introduce how to build an image classifier, followed by how segmentation was done in the pre-CNN era, and cover clustering techniques for segmentation. We start with the basics of neural networks, introduce Convolutional Neural Networks, and cover an advanced architecture: ResNet. We introduce the idea of the Fully Convolutional Networks paper and its impact on semantic segmentation, cover the latest semantic segmentation architectures with code, and go over the basics of scene text understanding in PyTorch, including how to run carefully designed experiments using callbacks and hooks. We introduce discriminative learning rates and mixed precision to train deep neural network models. The idea is to bridge the gap between theory and practice and teach how to run practical experiments and tune deep learning based systems, covering tricks introduced in various research papers. We also discuss in depth the interaction between batch norm, weight decay and learning rate.
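
    The PyTorch sketch below illustrates two of the tricks mentioned above, starting from a pretrained ResNet and using discriminative (per-layer-group) learning rates; the class count and learning rates are placeholders, not the workshop's code.

        import torch
        import torchvision

        model = torchvision.models.resnet18(pretrained=True)
        model.fc = torch.nn.Linear(model.fc.in_features, 10)    # new head for 10 classes

        optimizer = torch.optim.SGD([
            {"params": model.layer4.parameters(), "lr": 1e-3},  # later backbone layers: larger steps
            {"params": model.fc.parameters(),     "lr": 1e-2},  # freshly initialised head: largest steps
        ], lr=1e-4, momentum=0.9)                               # earlier layers are simply not updated here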

  • Liked: Rahee Walambe / Aditya Sonavane - Can AI replace Traditional Control Algorithms?

    45 Mins
    Case Study
    Beginner

    As technology progresses, control tasks are getting increasingly complex. Employing targeted algorithms for such control tasks and manually tuning them by trial and error (as in the case of PID) is a cumbersome and lengthy process. Additionally, methods such as PID are designed for linear systems, whereas real-world control tasks are inherently non-linear in nature. For such complex tasks, conventional linear control methods approximate the nonlinear system with a linear model, and in effect the required performance is difficult to achieve.

    New advances in the field of AI have presented us with techniques which may help replace traditional control algorithms. The use of AI may allow us to achieve a higher quality of control of the nonlinear process with minimal human interaction, eliminating the requirement for a skilled person to perform the menial task of tuning control algorithms by trial and error.

    Here we consider a simple case study of a beam balancer, where the controller is used for balancing a beam on a pivot to stabilize a ball at the center of the beam. We aim to implement a Reinforcement Learning based controller as an alternative to PID. We analyze the quality and compare the performance of a PID-based controller vs. an RL-based controller to better understand their suitability for real-world control tasks.
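
    For reference, here is a minimal discrete PID controller of the kind used as the baseline in this case study; the gains, time step and measurement are placeholders, not the authors' values.

        class PID:
            def __init__(self, kp, ki, kd, dt):
                self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
                self.integral = 0.0
                self.prev_error = 0.0

            def step(self, setpoint, measurement):
                error = setpoint - measurement
                self.integral += error * self.dt
                derivative = (error - self.prev_error) / self.dt
                self.prev_error = error
                return self.kp * error + self.ki * self.integral + self.kd * derivative

        controller = PID(kp=2.0, ki=0.1, kd=0.5, dt=0.01)
        beam_angle = controller.step(setpoint=0.0, measurement=0.12)  # ball 12 cm off-centre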

  • Liked: Tanay Pant - Machine data: how to handle it better?

    Developer Advocate, Crate.io
    45 Mins
    Talk
    Intermediate

    The rise of IoT and smart infrastructure has led to the generation of massive amounts of complex data. Traditional solutions struggle to cope with this shift, leading to a decrease in performance and an increase in cost. In this session, I will talk about time-series data and machine data, the challenges of working with this kind of data, ingesting it using data from NYC cabs, and running real-time queries to visualise the data and gather insights. By the end of this session, you will be able to set up a highly scalable data pipeline for complex time-series data with real-time query performance.
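
    As a tiny, generic illustration of the kind of time-series rollup such queries compute, here is a pandas sketch on toy trip data; it does not use the speaker's stack (CrateDB), and the records are made up.

        import pandas as pd

        trips = pd.DataFrame({
            "pickup_time": pd.to_datetime(["2019-01-01 00:05", "2019-01-01 00:20",
                                           "2019-01-01 01:10"]),
            "fare": [12.5, 8.0, 23.0],
        })
        per_hour = trips.set_index("pickup_time").resample("1H")["fare"].agg(["count", "mean"])
        print(per_hour)   # trips per hour and average fare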