I don't know about you, but when I start reading about audio sample rates, and scary numbers with decimal points and symbols like "kHz" start showing up, my brain tries to escape from my skull. Jeez, I'm a musician, not a tech geek (though not for lack of trying).
Unfortunately, if we're going to get into audio recording, we should train our brains to stay still long enough for some fundamentals. Just as we don't need to understand why our iPhones work in order to operate them, we don't truly need to know what a "kHz" is in order to grasp how it might be important to our recordings. It stands for kilohertz (1,000 cycles per second), and all you really need to know is that the music you listen to on your CDs is 44.1 kHz. So however you record your audio in your home recording studio, when it's finished, it should be 44.1 kHz.
Some folks believe you should record at higher rates, like 88.2 (stay with me!) kHz, converting down to 44.1 at the end. Personally, I don't see the point (get it? I made a decimal joke). Yes, technically the audio will be "higher definition" (pardon the video metaphor), but I don't think most folks would be able to tell the difference. Meh, to each their own.
Here is my article on sampling frequency: https://www.homebrewaudio.com/what-is-sampling-frequency/
Here is an article that tries to make the case for always recording at 88.2 kHz: