I'm a little worried that I've been relying on AI too much lately and that I'm losing critical thinking skills because of it. I've been deferring to it more and more: to help me write sales copy, which I think I'm bad at, for positioning, for looking stuff up, and for general rubber ducking, a term I know I've used here before, but I mean brainstorming and talking through stuff with it. So I'm trying to figure out if this is normal usage or if I really am outsourcing the most important stuff.

I don't use it for wholesale writing; I still do that myself. I will talk through an idea if I'm not in a position to write, and usually that ends up getting transcribed in Super Whisper and summarized somehow, but generally I don't feel like that output is ready for writing, so AI definitely doesn't do any writing for me. Sometimes I'll have it flesh out factual stuff, or I'll tell it "I'm stuck here." But the thing I'm worried about is that instead of trying to think through a problem, I'm accelerating the thought process by going straight to AI, and that that's bad.

It's a lot like when I get mad at my kids for walking into a room, standing there, saying "I can't find it," and then leaving without actually looking for the thing they came for. I feel like defaulting to AI is a lot like that: instead of thinking about something for a few minutes, saying "oh, I can't figure it out" and just outsourcing it.

So am I outsourcing my most important stuff to AI? I don't think so. Again, the things I said I use it for are (and sorry if you can hear my kids in the background; I'm recording this on the Fourth of July and they're home, spending some time outside since it's not oppressively hot) the following. First, brainstorming, which means I've already thought about a thing and now I'm trying to think of other things related to it that I might have missed.
Second, general search and questions, which has largely replaced Google for me. I've started to distrust Google's results because their AI summaries have been really bad, and since ChatGPT in particular includes sources, I'll usually default to that and then double-check if I'm not sure. But usually I'm just confirming something, or the answer I get is good enough.

One example: I made a ChatGPT project called Carb Checker. I've had type 2 diabetes for a couple of years, but I'm renewed in my resolve to really manage my blood sugar without medicine, or with as little medication as possible. Generally I'll use that project to check the carbs, fiber, and protein in a food, and I'll say something like "look for multiple sources, give me the highs and lows, and then give me the average." That, combined with my CGM (continuous glucose monitor), is generally good enough for most things. I'm obviously not doing deep research, and I'm not rigorously confirming the sources, though I do tell ChatGPT to cite its sources there. I have a pretty lengthy prompt that's basically: when I give you a food or a list of foods, give me the carbs and the glycemic index, how it can affect my blood sugar, stuff like that.

When it comes to actually recording, though, ChatGPT does not write scripts for me. It barely even comes up with ideas for me. I'll say "hey, come up with 20 podcast episode ideas," and I'll think maybe five of them are good enough, but those five are really good, and I wouldn't have thought of them. And again, there's the general rubber ducking stuff.

I think the reason I started to wonder about all this is that I'm launching a quiz, an overwhelm diagnostic, and AI wrote the questions, the answers, and the scoring, and I didn't feel it did such a bad job. I guess I redid the answers, or reworded some of them, and it came up with result PDFs, and I did rewrite a bunch
of those because they got really repetitive. But I didn't come up with the questions for the quiz; I just fed it a bunch of stuff about me, what I do, how I help people, and client testimonials. I guess I gave it a lot of information and let it come up with a quiz, and I think that's what got me worried. Now that I'm talking through it, though, I don't think I'm outsourcing my critical thinking skills. However, I will be more mindful about how quickly I default to going to AI in the future, because I don't want that to be my knee-jerk reaction or my first thought. I want to spend some time thinking about stuff before going to AI.

Let me know what your thoughts are on this; I'm really curious to hear how you're using AI. I'm interviewing a guy for the main podcast in a couple of weeks who did an AI detox, where he just decided not to use AI for a whole month. I don't think I would do that, but I'm definitely aware that compared to a year ago, or even six months ago, I'm using AI tools a lot more than I was, and I want to make sure it's not negatively impacting my work, or just me as a person and my ability to think critically.

So let me know what you think. You can send feedback over at streamlinedfeedback.com. Thanks so much for listening.
