Content provided by Voice of the DBA. All podcast content including episodes, graphics, and podcast descriptions are uploaded and provided directly by Voice of the DBA or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here https://podcastplayer.com/legal.
Be Wary of Data

 

I fly a lot, as you might have guessed if you read my blog regularly. In 2025, I’ve been on 56 United planes as I write this, with about 10 left to go before the end of the year. One of the things United does is sometimes send out a quick “survey” after a flight, checking to see if everything went smoothly. I don’t always fill these out, but recently I decided to give some feedback as I had a great experience.

I really wanted to just compliment the onboard crew, but the survey ran quite a few pages (10?) with a lot of questions. I started to fill it out, but lost focus after a few pages. This felt like a chore, and I began randomly clicking some of the selections asking me to rate things 1-10. I wasn’t really rating the items; I was just trying to get done. Eventually, I bailed on the survey without completing it, but that got me thinking about the data from these surveys.

I’m somewhat detail-oriented and I try to do a good job, but I couldn’t finish the survey. How many others just click through things and don’t really give an accurate picture of their feelings?

A similar situation occurs at work, where we have an HR rating system (Thymometrics), which I really like. Over time, it helps me keep an eye on how I feel about my job, the company, and my general attitude about work. We get quarterly reminders to fill this out, but I know quite a few people who don’t fill it out at all, or just click through and save the ratings without thinking about them. It’s another place where the data might be suspect.

At work we get feedback on various product metrics, in addition to uninstall and product feedback, sometimes with a rating that people click. Is that what they really think about their experience, or did they just click the first thing they saw? Or did they mis-click and find they couldn’t change their rating (clicking 2 when they meant 9)?

There is a lot of data that organizations collect from people that is very subjective. Across a large group of users, this should provide some sort of indication of how people feel, but if the sample sizes are small, can you really use this data? I think it’s easy for people in product management, marketing, and sales to view this data as much more accurate than it might be. I know I’m always wary of any outliers when I see feedback, and often I want to know how many people contributed.

Unless it’s a decently large number (100s at least) and there is a clear trend from many people (> 5%), I tend to discount the data as an outlier and not representative.
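That intuition about sample size can be put in rough numbers. As a sketch (not anything from the original piece), here's a quick way to approximate a 95% margin of error for an average rating; the standard deviation of 2.5 and the sample sizes are hypothetical values chosen for illustration:

```python
import math

def margin_of_error(std_dev, n, z=1.96):
    """Approximate 95% margin of error for a mean rating,
    using the normal approximation: z * (std_dev / sqrt(n))."""
    return z * std_dev / math.sqrt(n)

# Hypothetical example: ratings on a 1-10 scale with a
# standard deviation of 2.5.
# With only 20 responses, the margin of error is wide:
print(round(margin_of_error(2.5, 20), 2))   # 1.1, i.e. roughly +/- 1.1 points
# With 500 responses it narrows considerably:
print(round(margin_of_error(2.5, 500), 2))  # 0.22
```

With 20 responses, an average rating of 7 could plausibly be anywhere from about 6 to 8, which is why a handful of survey clicks shouldn't drive a decision.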

I’m not sure how many of you do this, but I’d encourage you to critically examine data and be wary of drawing conclusions, especially when you are getting impressions, feelings, and opinions from others.

Steve Jones

Listen to the podcast at Libsyn, Spotify, or iTunes.

Note, podcasts are only available for a limited time online.


