I work as a quality assurance engineer with a focus on the performance of security appliances/software. I take new code and appliances and run them through a series of tests to prove their performance based on different metrics. Once I have completed my tests I write statistical analysis papers which are sent to interested parties for review.
While I work on security architectures, my skill set isn't really limited to security. We do a bit of testing to verify usefulness, but in the end I could do this job on any sort of networking appliance. Aside from my product-driven responsibilities I also have several administrative duties related to the performance team's lab. File servers, databases, web servers, routers, switches, blade centers, security appliances, and telco equipment all, at one time or another, fall under my responsibilities.
So in this one job I am a performance QA engineer, security engineer, network admin, systems admin, DB admin, web server admin, and technical document writer, and I do statistical analysis. In every case I have varying degrees of expertise and experience. Each of these jobs, in and of itself, requires a slightly different skill set, and any one of them could be the primary responsibility of a single person.
So how did I end up with 7-9 different people's jobs? The answer lies in the focus of my particular group: performance.
Performance is the red-headed stepchild of QA. When a product is exercised by QA, they examine many factors of usability, user experience, and function. The performance of products is given thresholds. As long as the software doesn't consume above preset thresholds of memory usage, CPU usage, network usage, etc., it is generally considered performance nominal. Very few companies concern themselves with fine-grained performance tuning of software or appliances because these thresholds are set to such extremes. They set minimums and optimums, and as long as you were inside that range, performance wasn't the developing company's problem. Even now many companies give performance a passing glance rather than really digging into the data.
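That threshold model amounts to a trivial pass/fail check. As a minimal sketch (the metric names and limits below are hypothetical, not taken from any real product):

```python
# Hypothetical per-metric limits of the kind described above: as long as
# every sampled value stays at or under its preset threshold, the product
# is declared "performance nominal" and nobody digs deeper.
THRESHOLDS = {
    "memory_mb": 2048,    # maximum allowed resident memory
    "cpu_percent": 80,    # maximum allowed CPU utilization
    "network_mbps": 900,  # maximum allowed network bandwidth consumed
}

def performance_nominal(sample: dict) -> bool:
    """Return True if every sampled metric is within its preset threshold."""
    return all(sample.get(name, 0) <= limit for name, limit in THRESHOLDS.items())

# A run well inside the limits passes; one blown metric fails the whole run.
print(performance_nominal({"memory_mb": 1024, "cpu_percent": 45, "network_mbps": 300}))
print(performance_nominal({"memory_mb": 4096, "cpu_percent": 45, "network_mbps": 300}))
```

The point of the sketch is how coarse this is: a single boolean per run, with no record of *where* inside the range the product landed or *why*.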
So, as expected, I work on a fairly small team that operates as an offshoot of QA. This provides us with many opportunities. In order to keep up with evolving technology we have to acquire cutting-edge products and services. We have to explore new and exciting methodologies and practices. My job changes day to day depending on what new information we take in.
In statistical analysis all data is useful. Aberrant data, bad data, misapplied data: it's all clues leading to the answer to the question, why does what I am observing occur? In the same way, every day my job evolves because of some new piece of data. We are [almost] never surprised or caught off guard when new data appears. We strive to examine and integrate the details into our world as quickly as possible, and not even the most minimal or seemingly pointless data goes unnoticed. Even if a product performs as expected, we search for why it performs as expected. If I had to list one qualification for the job I do, it would have to be attention to detail.
The devil is, as always, in the details.