“There may be people that have more talent than you, but there's no excuse for anyone to work harder than you do.” - Derek Jeter.
Getting training - both to acquire the skills you need to do your job well and to grow your skillset and open up your career options - is a key consideration for anyone working in I.T.
And as important as that is, a lot of people fall by the wayside and never make a concerted effort in this regard. Many people in I.T. leave it up to their employer to find, budget for and schedule their training and education. This usually comes packaged as a performance-review initiative to "work with you and get you, the employee, the skills you desire and need to move into your best and most fulfilling role in the company...". That's the usual spiel.
But the reality is far from this ideal performance-review puff-piece. Either there's no training or course to put you on for several years, or the ones you're lucky enough to attend are irrelevant to you or the company (e.g. I went on a VMware NSX course when the company didn't use NSX, so we couldn't take those skills back anywhere useful for the company).
While a valid argument exists that the company should provide the 'on the job training and upskilling' the position promised all those years ago - do you really want to leave your future in the company's hands? It's your choice. For me personally, and because learning how to use new technology is fun for me, I would rather take responsibility for my own skillsets, and align what's useful for me in my roles (devops thinking, SRE toolchains, relevant languages) with what educational resources are available.
My first port of call when I just want to dabble in a new technology, or get a quick 'learn-by-building' feel for something, is tutorials. There's no "one resource" for tutorials; I literally go to Google and search something like "build a Docker based 3-tier application", "Terraform with secure secrets tutorial", or "build simple Jenkins pipelines".
The only point of tutorials, for me, is to get a quick sample of the real-deal bits & pieces involved in "doing something" with that particular technology. For example: is there a big setup job before you can get anywhere useful, i.e. lots of things to install, configuration files to populate and set, etc.? And how much hassle is it to take a simple project from A through F?
As you work through it, a lot of the time in my experience the tutorial doesn't work - because it's a little out of date, or the system I'm building on is slightly different and the versions and/or syntax of commands have changed, or features are missing, etc. This is where you go off-script to the vendor's documentation and start reading API references and user guides to fix your little bit of difference. And now you get a taste of the vendor's documentation - is it easy to follow and understand? Are you able to quickly scan, understand and fix your issue?
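That version drift is usually the first thing I check before blaming myself. As a minimal sketch (the tool name and version numbers below are illustrative assumptions, not from any particular tutorial), a tiny helper can flag when your installed tool's version differs from the one the tutorial was written against:

```shell
#!/bin/sh
# Sketch: warn when the locally installed tool version has drifted from the
# version a tutorial assumes, so you know command syntax may have changed.
# The versions "1.5.7" and "1.5.3" are hypothetical examples.

version_matches() {
  # Compare only major.minor - patch releases rarely change CLI syntax.
  tutorial_ver=$(echo "$1" | cut -d. -f1,2)
  local_ver=$(echo "$2" | cut -d. -f1,2)
  [ "$tutorial_ver" = "$local_ver" ]
}

if version_matches "1.5.7" "1.5.3"; then
  echo "tutorial and local versions line up"
else
  echo "versions drifted: expect syntax differences"
fi
# prints "tutorial and local versions line up" (1.5 == 1.5)
```

In practice you'd feed the second argument from the tool itself (e.g. the output of its `--version` flag), but even eyeballing the two numbers up front saves a lot of "why doesn't this command exist" head-scratching.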
So following tutorials lets you taste the tech, and also gets you acquainted with the vendor - their documentation, their support methods and their style.
I wasn't always an advocate for paying for things on the Internet, because I figured you can always find the information you're looking for online. Somewhere, someone has documented it for free - so why pay someone for their online course when Terry from www.geociticies.com/~gotknawlege_terry01/, who has spent all of 7 hours reading up on the topic, can provide you with a somewhat accurate, possibly inaccurate lesson for nothing?
But with the sheer scale and volume of INFORMATION available now, it's hard to separate the quality material that will educate and enrich you for your time from everything else that ends up costing you time because it's inaccurate and wrong. You spend hours needlessly troubleshooting dead-ends, and ultimately learn nothing you can have much confidence in - other than improving your troubleshooting skills.
Pay for a good course online. It's not expensive, and the material has been carefully put together with clear explanations by industry experts who hold relevant qualifications. A good course will save you time, hassle and learning headaches - and let's be honest, if you're running a busy life keeping up with work demands in I.T., time and hassle are things you could do with not wasting.
A bonus with good online courses is that many are geared towards certification - so you can turn all those hours of learning into knowledge that works towards sitting an exam in that topic and getting a cert!
I read someone's post the other day (this person has a PhD in Computer Science) about the Degree vs Certification debate, and he gave a great answer. He said you should do both, because they serve different purposes (I'm paraphrasing). And I agree. (Note: this person ALSO held a number of industry certs, so was well qualified to speak from both sides of the argument.)
I used to argue that University was useless for the person who had the server admin skills and just wanted to get out there and kick a$$. That a degree in Computer Science was for academics who wanted to write better data compression algorithms, not for the lone ops guy looking after 800+ Unix servers.
But that was me not understanding the broader scope of what different kinds of learning are about. I've since been to Uni and completed a bachelor's degree in something that wasn't even computer related (I did a BCom in Management & International Business).
What I learned from my time at University was actually priceless. I learned to read and digest information from lots of different sources; to better critique different arguments and produce sound arguments of my own; to think critically and quickly. These may not sound "technical", but they very much apply to all aspects of my career in I.T.
University teaches you a way of thinking through things, thinking about things, and also some reading and writing for academic purposes.
Industry certs are something I also have. They're not always easy to get, because you have to specialise in one topic from one vendor and "go deep". But in the same way that University study disciplines you in a specific method of knowledge and learning, so too does studying for an industry cert discipline you in the things you need to know and demonstrate in order to pass that certification.
Actual usefulness on the job depends on your particular on-the-job experience, but I can't see how having deep knowledge of something you work with daily - even if you never flex as far as the exam required - could be a bad thing for you, or the company.
Verdict: (if you can) Do BOTH.
Wow, this got long quick. But the main takeaways I really want you to have from this topic are:
- Your education is YOUR responsibility - not your employer's. At the end of the day, whether you can get another job, or get a job doing something you like, is up to you. If you don't do anything about it, your company has no mandate to do it for you.
- There's zero excuse not to learn. With the internet providing EVERYTHING you could possibly need to learn for any job in I.T. - from free tutorials to very affordable online courses - there is no reason not to empower and equip yourself with everything you need to succeed.
Invest in yourself and take control of where you want your career to go. Sure, there are going to be "no"s in everyone's future job-wise, but if you know how to build and fix things, you actually have those skills and that cert - and eventually those factors will be undeniable. Good luck.