It’s Day One. Having missed the seminar I really wanted to attend in the earlier timeslot, I’ve parked myself right outside Rm 349 to attend “DPR05-INT Developers Are from Mars, Testers Are from Venus”.
2:14 PM: I’m the second person into the room, and seat myself near the back to ensure access to a wall power outlet.
2:15 PM: The presenter of the last seminar in this room, “Microsoft Visual Studio 2010 Tips and Tricks,” has a line four deep of people wanting to ask extremely technical questions. He’s looking around in desperation, wondering how he can get out of here.
2:17 PM: The crowd up front dissipates. One of the presenters of my seminar asks him how it went. He says, “Fine, just answering question after question… Slides are boring!” I tend to agree, though a seminar doesn’t strike me as the right place to ask how to solve bugs you run into at work. He wishes her luck with a “break a leg!” and leaves in a hurry.
2:20 PM: It’s quiet now, so I continue to work on turning my keynote scribbles into something approaching readable.
2:21 PM: The conference hall’s Internet has been down for more than an hour now. I overhear that supposedly a main trunk is down and it’s actually an issue affecting the entire city.
2:40 PM: The seminar room is half full; maybe testing isn’t a popular topic. We’re told we’ll start in 5 minutes.
2:46 PM: We start by focusing on what’s new with 2010: Manual Testing!
- Your testers can make use of Testing Center in Visual Studio 2010 Test Professional. It’s an environment geared entirely toward recording useful data while running tests on .NET applications.
- Testing Center puts each tester in an instance of a virtual machine (VM). If a bug occurs, the entire VM is filed away into TFS, allowing the developer to re-run steps in the exact environment in which the tester ran into the bug.
- A manual test, at its simplest, consists of a list of defined steps a tester follows. As they complete steps (for the first time), those steps are recorded (as raw HTTP requests) so they can be sped through automatically on subsequent runs.
- Manual Test Runner: Allows you to fill out multiple sets of data that the program automatically iterates through, letting you quickly test a variety of scenarios.
- Filing a bug: Lots of data is autopopulated; each step even has a video recording associated with it, allowing the developer to quickly jump to a specific timeframe of a test.
- Action log: All the obvious data (OS details, memory, disk space) is filled out automatically.
- Bug is linked back to the actual test case.
- Pretty much everything but the bug’s title is filled out automatically.
- Manual Test has the TiVo of debuggers: replay code step by step, even backwards, and/or view the stack to see exactly what happened.
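The data-driven iteration described above (one set of recorded steps, many parameter sets) can be sketched roughly as follows. This is an illustrative sketch in Java rather than the session’s C#/VB.NET, and the class and method names here are made up, not the actual Test Professional API:

```java
import java.util.List;
import java.util.Map;

// Illustrative sketch (not the real API): the runner replays the same
// recorded steps once per data row, e.g. one row per login scenario.
public class DataDrivenRunner {
    public static int runAll(List<Map<String, String>> rows) {
        int passed = 0;
        for (Map<String, String> row : rows) {
            // Each iteration reuses the recorded steps with a new data set.
            if (runSteps(row)) {
                passed++;
            }
        }
        return passed;
    }

    // Stand-in for replaying the recorded steps against one data row.
    static boolean runSteps(Map<String, String> row) {
        return row.containsKey("username") && row.containsKey("password");
    }

    public static void main(String[] args) {
        List<Map<String, String>> rows = List.of(
                Map.of("username", "alice", "password", "s3cret"),
                Map.of("username", "bob", "password", "hunter2"));
        System.out.println(runAll(rows)); // prints 2
    }
}
```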
2:59 PM: We’re done looking at Manual Testing.
3:00 PM: A question from the audience on how much space this will take. The answer: A lot of it, but Test Center’s configuration is highly flexible; always-on video might be good for remote groups in different time zones, but not so useful (or too costly) for others.
Also: IntelliTrace logs get particularly huge, but they thought ahead on this one – a single IntelliTrace log that reveals multiple bugs can be shared across those bugs.
Geonka steps in to mention that a power tool for deleting old bug attachments (IntelliTrace logs, video recordings) is imminent.
3:05 PM: People in the seminar room next door are shouting and applause is randomly breaking out. Note to self, check topic name. (I forget to do this.)
3:06 PM: Audience question: What do you do if you’ve already got lots of test cases defined in another program? A test case migration tool exists: export from other test suites to Excel, then from Excel to the Test Case Migrator.
3:08 PM: What if you don’t know exactly where a bug started to occur during a test run? They’ve got it covered – you’re able to create an “exploratory bug” and scope a period of time in which the bug possibly occurred.
3:10 PM: An audience question about interoperability with ClearQuest. No interoperability with other test suites exists or is planned.
3:11 PM: What if you need to collect a log file as part of creating a bug attachment? It’s covered – this can be specified as part of the test case (as part of the failure branch), though a hundredish lines of code might need to be written to accomplish this.
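That “hundredish lines of code” for grabbing a log file on failure might look something like this in spirit. A hedged, illustrative Java sketch (the session’s tooling is C#/VB.NET, and this is not the actual Test Professional extensibility API):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

// Illustrative sketch: on a failed test step, read an application log
// file so its contents can be attached to the bug being filed.
public class LogCollector {
    // Returns the log contents to attach, or null when the step passed.
    public static String collectOnFailure(boolean stepPassed, Path logFile)
            throws IOException {
        if (stepPassed) {
            return null; // nothing to attach on success
        }
        if (Files.exists(logFile)) {
            return Files.readString(logFile);
        }
        return "(log file not found: " + logFile + ")";
    }

    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempFile("app", ".log");
        Files.writeString(tmp, "NullReferenceException at Login()");
        // Failed step: the log contents come back for attachment.
        System.out.println(collectOnFailure(false, tmp));
    }
}
```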
3:12 PM: Are project managers usually a prickly bunch? Three or four questions asked so far have been about customizing interfaces or workflows for project managers who don’t want to install or learn anything new. Can you be an effective project manager if you’re not open to new technology? Isn’t your whole job to manage the creation of new technology?
3:16 PM: Next demo: When the developer receives a bug that’s been filed by the tester, what tools exist to assist him/her?
For starters, the IntelliTrace log is linked to the bug, showing a list of threads that were active down the timeline, and a list of exceptions that were thrown.
Next, and this is a momentous one: the developer can step (F10) through the trace of the bug and find the exact line of code that triggered it.
3:19 PM: An audience question on reporting. Not much in the way of qualitative reports exist in the project right now, but they’re thinking about adding them for the next version.
3:20 PM: These guys don’t use weasel words; if they don’t have something in their app, they say, “No.” Nice.
3:22 PM: An audience member is using the word “repros” as shorthand for bug “reproductions”. (As in, “The repros might be incomplete.”) There are some hardcore QA people in our midst.
3:24 PM: Just re-enabled wireless and tried to access Gmail. Still no go.
3:25 PM: Audience question: Can Test Center be used by people without TFS access? Not really: Test cases can be exported as a Word document, but external testers (with no TFS access) are on their own when it comes to filing things back to the system.
3:29 PM: Audience question: Will we see a standalone program for testing (meaning no install of Visual Studio 2010 on the machine)? Maybe, but not today.
3:30 PM: Last demo of the seminar! The topic: User interface testing (a.k.a. regression testing).
First item of note: we’ll be able to reuse the manually created test cases for this purpose. (To clarify for non-developer types, the ideal is to automate ponderous UI testing as much as possible.)
The automation seems to make heavy use of .NET controls, which means proper use of .NET forms is a requirement.
A note here: coded UI tests run at a different layer than the can-be-automated manual testing, which uses raw HTTP requests. UI testing occurs at the highest (user) layer – it replicates actual UI actions: mouse clicks and keyboard input.
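The layering difference can be sketched as a user-level action log: the recording captures gestures (clicks, keystrokes) rather than HTTP requests, and playback replays those gestures. An illustrative Java sketch, not the actual Coded UI API:

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch: a coded UI test records user-level actions
// (clicks, typed text) and replays them, rather than HTTP requests.
public class UiActionLog {
    record Action(String kind, String target, String data) {}

    private final List<Action> recorded = new ArrayList<>();

    void click(String control)            { recorded.add(new Action("click", control, null)); }
    void type(String control, String txt) { recorded.add(new Action("type", control, txt)); }

    // Replay renders each action as the user gesture it stands for.
    List<String> replay() {
        List<String> out = new ArrayList<>();
        for (Action a : recorded) {
            out.add(a.kind() + " on " + a.target()
                    + (a.data() == null ? "" : " [" + a.data() + "]"));
        }
        return out;
    }

    public static void main(String[] args) {
        UiActionLog log = new UiActionLog();
        log.type("usernameBox", "alice");
        log.click("loginButton");
        System.out.println(log.replay());
        // prints [type on usernameBox [alice], click on loginButton]
    }
}
```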
3:48 PM: Audience question: Are code coverage reports available? Yes. UI coverage reports? Not today, but they’re thinking about it for a future release.
3:49 PM: Note: Test Professional is a separate SKU and doesn’t come with every version of Visual Studio 2010.
3:50 PM: Audience question: If your UI changes or a database schema is modified, do your developers have to step in and recode an automated test? Answer: A tool exists called UI Test Writer that will cover 90% of these situations and allow you to avoid writing code.
3:51 PM: Note: Both C# and VB.NET can be used to create automated test cases.
3:53 PM: Addendum to the question about UI or database schema changes: Shared steps allow you to take, for example, a first step of logging in and plop it into any number of test cases. Later, if the login box changes, you can easily replace that common step across a whole bunch of test cases.
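The mechanics of shared steps amount to test cases holding a reference to one shared step object, so editing that object once updates every test case that uses it. A minimal illustrative sketch in Java (the class names are made up, not Test Professional’s model):

```java
import java.util.List;

// Illustrative sketch: many test cases reference one shared step, so
// editing the shared step updates every test case that uses it.
public class SharedStepsDemo {
    static class SharedStep {
        String description;
        SharedStep(String d) { description = d; }
    }

    static class TestCase {
        final String name;
        final List<Object> steps; // SharedStep references or plain strings
        TestCase(String name, List<Object> steps) {
            this.name = name;
            this.steps = steps;
        }
        String firstStep() {
            Object s = steps.get(0);
            return (s instanceof SharedStep ss) ? ss.description : s.toString();
        }
    }

    public static void main(String[] args) {
        SharedStep login = new SharedStep("Log in via the login box");
        TestCase checkout = new TestCase("Checkout", List.of(login, "Add item to cart"));
        TestCase profile = new TestCase("Edit profile", List.of(login, "Open settings"));

        // The login box changes: update the shared step once...
        login.description = "Log in via the new single-field login box";

        // ...and every referencing test case picks up the change.
        System.out.println(checkout.firstStep());
        System.out.println(profile.firstStep());
    }
}
```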