The cloud is a huge paradigm shift compared to your usual software development cycle. Imagine for a minute you have a requirement to build an API that customers would pull data from. You would fire up your favorite IDE, build, say, a Swagger UI, and then deploy on whatever web server you already have. Rules are set in the software, and there may be some custom validation and so on. You would set up a database where you already have other databases, using file systems that are shared with others. Some of those choices may not be the best for the project -- RDBMS vs. NoSQL, file system vs. other kinds of storage, and so on -- but you use what others already have anyway, to steer clear of the political waters and the finance folks and deliver a solution for the customers as quickly as possible. Oh, right, you'd have to add logging -- you might write info directly to the database, since you don't have the ability to spin up a new Elasticsearch server on a whim. Again, you don't want to set up another server for fear of overprovisioning the ecosystem if the API isn't as popular as you thought it would be.
Well, with the cloud in the picture, you'd build said methods as FaaS (Function as a Service). The functions would live in a compute module that only costs money when executed; logging would be built in and searchable; and the choice of database is yours -- you could even try NoSQL for the heck of it. Your cloud admins (or you?) would only need to configure firewalls and route tables, and you'd get instant feedback on whether it worked. Next, you'd set up an API gateway at the front -- again, charged only when it's used. The kicker is this: if the primary developer leaves the company, the scaffolding and rules are already written down, probably in a big JSON config file, so when you hire a replacement, that dev instantly knows what's going on. That shortens the learning curve drastically. It also gets the stakeholders (and shareholders) excited, because the time to market for new features and bug fixes is considerably shorter.
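To make the FaaS idea concrete, here's a minimal sketch of a function in the AWS Lambda handler style. The event shape, the `CUSTOMERS` data, and the customer IDs are all hypothetical placeholders, not a real API -- the point is just how little scaffolding the function itself needs:

```python
import json

# Hypothetical stand-in for the real data store.
CUSTOMERS = {"42": {"id": "42", "name": "Acme Corp"}}

def handler(event, context=None):
    """Return one customer record; the platform handles scaling for us."""
    customer_id = event.get("pathParameters", {}).get("id")
    record = CUSTOMERS.get(customer_id)
    if record is None:
        return {"statusCode": 404, "body": json.dumps({"error": "not found"})}
    # Anything printed here lands in the platform's searchable logs
    # (e.g. CloudWatch on AWS) with no extra setup.
    print(f"served customer {customer_id}")
    return {"statusCode": 200, "body": json.dumps(record)}
```

The API gateway routes the HTTP request to this handler and you're billed per invocation; there is no web server, log shipper, or deployment pipeline for you to babysit.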
With the cloud, you would know EXACTLY how much it costs to serve customers. Bear with me and let me explain what I mean. Let's go back to old-school systems... you have a server farm with a bunch 'o servers, all set to go with whatever you throw at them. You pay X amount for maintenance, licenses, and the people to manage them. Costs feel reasonable -- you've been spending money on those servers all these years, so it doesn't hurt the bottom line. One day, some bean counter comes along and tries to calculate the technical cost of serving external customers. Turns out they can't, because you likely have no way to distinguish compute time spent on internal customers from compute time spent on external ones; everyone is sharing the same infrastructure, databases, file systems, and so on. With the cloud, you can easily build one module that serves external customers and another that serves internal ones. Internal customers would likely be using a different service anyway, since they need to work with raw-ish data, while external customers only touch the presentation layer. Now it's easy to distinguish the compute times, so you can price your products with the cloud costs baked in, along with the appropriate profit margins.
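A back-of-the-envelope sketch of that cost attribution, assuming Lambda-style pay-per-use pricing (a flat fee per request plus a fee per GB-second of compute). The rates and traffic numbers below are illustrative placeholders, not quotes from any provider's price sheet:

```python
# Placeholder rates, roughly in the shape of serverless compute pricing.
PRICE_PER_REQUEST = 0.0000002      # dollars per invocation (illustrative)
PRICE_PER_GB_SECOND = 0.0000167    # dollars per GB-second (illustrative)

def monthly_cost(invocations, avg_duration_s, memory_gb):
    """Estimate one module's monthly compute bill from its own usage."""
    compute = invocations * avg_duration_s * memory_gb * PRICE_PER_GB_SECOND
    requests = invocations * PRICE_PER_REQUEST
    return compute + requests

# Separate modules mean separately attributable bills:
external = monthly_cost(5_000_000, 0.2, 0.5)   # customer-facing API
internal = monthly_cost(200_000, 1.5, 1.0)     # internal raw-data service
```

Because each module is metered on its own, the bean counter gets a per-module number instead of one undifferentiated server-farm bill.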
Your cloud providers are in this to make money (didja see those stock prices?!), and at the same time, they have an incentive to get rid of your precious infrastructure, thus saving you money. I know it's weird, especially since we're all trying to find the right balance (and trust!) between keeping things in-house/on-premise and moving them to the cloud. There are always pros and cons, but I suspect the pros will become more palatable as cloud providers add more features. I hope this explains why I believe the cloud is crucially important on the grounds of cost and flexibility in the development process. Take the time to understand the cloud paradigms, and it'll help your career trajectory.