Julian Harty

Testing mobile apps is easy to do poorly: poke the GUI, install the app on a couple of devices, and declare your results. We don’t need to be constrained by mediocrity, though. Instead, let’s learn about the foundations of how mobile platforms and development technologies are used to create apps, and how those apps are then interpreted by the devices they are installed on, so that we understand the sorts of bugs and problems that affect many mobile apps, i.e. testing techniques that are generally applicable to most apps. We’ll also investigate the capabilities and tools available to developers and to those who support mobile apps, and how to harness these tools and the data they provide to refine and improve our testing.

In addition to general mobile testing techniques, we’ll investigate ways to decide what to test next and what can be left in the morass of “won’t be tested”. As we learn more about specific aspects of an app, we can further refine the testing and use various analytics and research to improve it. There’s plenty of data available to help us improve the testing, and even the development, of mobile apps if we choose to collect and use it. Privacy and protection of users are also key, and part of being a trustworthy, professional tester, so we’ll touch on these topics and on how they’re generally designed and implemented in mobile apps.
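
To make the data-collection and privacy points concrete, here is a minimal sketch, in Kotlin, of how an app might gate its own analytics behind a user opt-out. The Analytics object, event names and opt-out flag are hypothetical illustrations for this proposal, not the API of any particular analytics library.

```kotlin
// Hypothetical sketch: in-app analytics gated by a user privacy opt-out.
// None of these names come from a real analytics SDK; they only illustrate the idea.
object Analytics {
    // In a real app this flag would be persisted (e.g. in user settings) and
    // surfaced in the UI so users can opt out of data collection.
    var collectionEnabled: Boolean = false

    private val pendingEvents = mutableListOf<Pair<String, Map<String, String>>>()

    fun record(event: String, properties: Map<String, String> = emptyMap()) {
        if (!collectionEnabled) return   // respect the user's choice: drop the event
        pendingEvents += event to properties
    }

    fun flush(send: (List<Pair<String, Map<String, String>>>) -> Unit) {
        if (pendingEvents.isEmpty()) return
        send(pendingEvents.toList())     // e.g. upload to an analytics back end
        pendingEvents.clear()
    }
}

fun main() {
    Analytics.record("app_start")                        // dropped: the user has not opted in
    Analytics.collectionEnabled = true                   // user opts in via a settings screen
    Analytics.record("screen_view", mapOf("screen" to "checkout"))
    Analytics.flush { events -> println("Uploading ${events.size} event(s): $events") }
}
```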


Outline/structure of the Session

A combination of examples and demonstrations using open-source Android apps. Participants are welcome to use and test these apps and/or apps they already work with.

Learning Outcome

Participants will actively practice testing mobile apps and learn how to use mobile analytics and reviews to help shape and guide their testing.

Target Audience

People who want to practice and improve the ways they test mobile apps

Requirements

The workshop will include hands-on testing and working with the mobile app ecosystems so you can maximise your learning and experience of this interesting and lively field. Bring your smartphones, tablets and apps, bring your development environment on a laptop, and be prepared to get involved and practice testing while having fun.

In order to thrive, participants should know how to build and deploy their mobile app onto a smartphone before coming to this workshop.


  • Woody Zuill & Todd Little
    Sold Out!
    480 mins
    Workshop
    Beginner

    Let’s explore the purpose and use of estimates in the management of software development efforts, and consider possible alternatives. Why do we estimate and are we making estimates that are actually useful?  We say we depend on estimates to make important decisions, and yet we’re often disappointed by the results.

    Why are we so challenged at estimation? Are estimates for cost, time, or effort always needed? Is it possible there are other ways to approach our work? If we didn’t estimate, how could we still help make businesses successful?

    The default use of an "estimate-driven" approach is pervasive in software development efforts, and while estimates can be useful, it is worthwhile to scrutinize our use of estimates for cost, time, and effort, and to seek better ways to manage software development projects.

    There are a number of things to explore and many questions to ask. For example, do we really need estimates for all the things we currently use them for? Are we getting a reasonable benefit from them? Is it possible to manage software development projects without these estimates, or at least with fewer estimates? Is there a way to prove that estimates are helping us make good decisions?

    In this session we’ll participate in some interactive information-gathering exercises to build a shared understanding of the purpose and use of estimates. We will examine the nature of software development projects and explore some real data to shed light on the art and science of software estimation. Our goal in this exploration is to see whether, together, we can come up with some ideas for improving on the traditional approaches to using estimates.

  • David Laribee
    David Laribee
    schedule 2 months ago
    Sold Out!
    480 mins
    Workshop
    Intermediate

    In the early 2000s, eXtreme Programming (XP) introduced agility to software engineers. Contemporary cultural and technical innovations - container technology, distributed version control systems, the proliferation of free and open source software, and the DevOps movement - have significantly expanded our possibilities.

    In this one-day, hands-on workshop, we’ll build a modern continuous deployment pipeline based on Git, Jenkins, and Docker. Starting with continuous integration, we’ll practice Git workflows enabling parallel development with pull requests and explicit dependency management through the use of forked repositories. We’ll then extend the ecosystem to support ad-hoc testing environments, multi-versioned deployments, and build promotion. We’ll survey tools and techniques for production deployments, touching on Docker Swarm, Kubernetes, ChatOps, and emerging tools used in serverless architectures such as AWS Lambda.

    While technologies change, values and principles continue to guide our choices. We’ll end with reflection and a guided discussion on how core XP values - simplicity, feedback, communication, courage - can serve as a compass for environmental and workflow decisions that impact our customers and teammates.

  • Scott Ambler
    Scott Ambler
    schedule 2 months ago
    Sold Out!
    480 mins
    Workshop
    Beginner

    Disciplined Agile (DA) is an IT process decision framework for delivering sophisticated agile solutions in the enterprise. It builds on the existing proven practices from agile methods such as Scrum, Extreme Programming (XP), Lean software development, Unified Process, and Agile Modeling to include other aspects necessary for success in the enterprise. DA fills in the gaps left by mainstream methods by providing guidance on how to effectively plan and kickstart complex projects as well as how to apply a full lifecycle approach, with lightweight milestones, effective metrics, and agile governance.

    The one-day workshop is not technical and is suitable for all team members. Many group exercises reinforce the principles learned. The workshop is also valuable for management tasked with moving from traditional approaches to agile.

  • Julian Harty

    Improving Mobile Apps using an Analytical Feedback Approach

    Sold Out!
    45 mins
    Talk
    Intermediate

    There are various ways we can improve the testing of mobile apps. These include:

    • Better testing
    • Test automation
    • Scaling testing
    • Using software tools e.g. static analysis

    For mobile apps we can also incorporate two complementary forms of feedback: analogue - created by humans - such as app store reviews, and digital - built into the apps - such as mobile analytics. This talk introduces ways to incorporate both analogue and digital feedback to help us better understand the qualities of our previous work and how we can improve our work in future, so we release better apps while also working more effectively and productively.

    The feedback we receive can help us adapt and react nimbly, reducing the annoyances for users of our apps. We're also able to see beyond our team's horizon to the rich and varied domains where mobile apps are being used.
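
    As an illustration of the analogue channel, the following is a small hypothetical Kotlin sketch that tags app store review text with themes, so that recurring complaints can help prioritise what to test next. The theme names, keyword lists and sample reviews are invented for this example and are not part of any store's API.

    ```kotlin
    // Hypothetical sketch: triaging app store review text into themes that can guide testing.
    // The keyword lists and sample reviews are invented examples.
    val themes = mapOf(
        "stability" to listOf("crash", "freeze", "force close"),
        "performance" to listOf("slow", "lag", "battery"),
        "usability" to listOf("confusing", "can't find", "hard to use")
    )

    fun tagReview(review: String): List<String> {
        val text = review.lowercase()
        return themes.filterValues { keywords -> keywords.any { it in text } }.keys.toList()
    }

    fun main() {
        val reviews = listOf(
            "App crashes every time I open the camera",
            "Great app but the checkout screen is confusing",
            "Drains my battery and feels slow lately"
        )
        // Count how often each theme appears, so the noisiest areas get tested first.
        val counts = reviews.flatMap(::tagReview).groupingBy { it }.eachCount()
        println(counts)   // e.g. {stability=1, usability=1, performance=1}
    }
    ```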

  • Julian Harty

    Does software testing need to be this way? Tools, approaches and techniques to test more effectively

    Sold Out!
    45 mins
    Talk
    Advanced

    Software development teams recognise that testing is relevant and important. Testers want to add value and do purposeful, meaningful work; however, software automation is encroaching on, and in some cases obviating, much of the hand-crafted testing - including some of the ‘automated tests’ created by teams. As Nicholas Carr asks in his book The Glass Cage: “Who needs humans anyway?”

    And yet, humans - people - have much to contribute to crafting excellent software, including testing the software. This presentation investigates:

    • leading automation techniques, to understand more of what they can offer us in terms of testing our software (a minimal GUI test automation sketch follows this list)
    • how structured testing techniques can help all testers, including "exploratory testers"
    • where analytics can help
    • tools, approaches and techniques to help test more effectively
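
    As a taster of the first point above, here is a minimal GUI test automation sketch in Kotlin, assuming AndroidX Test with Espresso; LoginActivity and the R.id.* view ids are hypothetical stand-ins for whichever app is under test, and the test would run as an Android instrumented test rather than on its own.

    ```kotlin
    // Minimal Espresso sketch (hypothetical LoginActivity and view ids).
    import androidx.test.espresso.Espresso.onView
    import androidx.test.espresso.action.ViewActions.click
    import androidx.test.espresso.action.ViewActions.closeSoftKeyboard
    import androidx.test.espresso.action.ViewActions.typeText
    import androidx.test.espresso.assertion.ViewAssertions.matches
    import androidx.test.espresso.matcher.ViewMatchers.isDisplayed
    import androidx.test.espresso.matcher.ViewMatchers.withId
    import androidx.test.ext.junit.rules.ActivityScenarioRule
    import androidx.test.ext.junit.runners.AndroidJUnit4
    import org.junit.Rule
    import org.junit.Test
    import org.junit.runner.RunWith

    @RunWith(AndroidJUnit4::class)
    class LoginScreenTest {

        // Launches the (hypothetical) activity before each test and closes it afterwards.
        @get:Rule
        val activityRule = ActivityScenarioRule(LoginActivity::class.java)

        @Test
        fun validCredentials_showWelcomeMessage() {
            onView(withId(R.id.username)).perform(typeText("demo-user"), closeSoftKeyboard())
            onView(withId(R.id.password)).perform(typeText("demo-pass"), closeSoftKeyboard())
            onView(withId(R.id.login_button)).perform(click())

            // Check the outcome, not just that the taps happened.
            onView(withId(R.id.welcome_message)).check(matches(isDisplayed()))
        }
    }
    ```
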
  • Nate Clinton
    Sold Out!
    45 mins
    Keynote
    Intermediate

    Now that we talk to our computers, what new possibilities and tradeoffs (and business models) will emerge? Today's "digital assistants" are relatively weak, but hint at an evolution in our way of living and working and transacting. In this session, we'll explore the current state of conversational interfaces, speculate about the future of the intelligent assistant, and point to how humans will play a role that new world.