I have talked with quite a few people in the energy sector in the last year, and yes, I think the same "pragmatic AI" approach is valid. In energy, some of the problems I have encountered involve managing dynamic capacity, say a solar system integrated into a building. The complexity in this problem comes from the existing infrastructure and staff (who may be less receptive to change). By using "off the shelf" frameworks like AWS SageMaker or Google BigQuery ML, results can be produced quickly, which gains the trust of the people involved in change management.
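As a rough illustration of what "quick results" can look like, here is a minimal sketch that trains a simple regression inside BigQuery ML from the Python client, with no separate ML infrastructure to stand up. The dataset, table, and column names (energy.building_telemetry, kwh_output, etc.) are hypothetical placeholders, not a real schema.

```python
# Minimal sketch: train a baseline regression on (hypothetical) building solar
# telemetry using BigQuery ML. All names below are placeholders.
from google.cloud import bigquery

client = bigquery.Client()  # uses the default GCP project and credentials

train_model_sql = """
CREATE OR REPLACE MODEL `energy.solar_capacity_model`
OPTIONS (
  model_type = 'linear_reg',         -- simple, explainable baseline
  input_label_cols = ['kwh_output']  -- column we want to predict
) AS
SELECT
  hour_of_day,
  cloud_cover,
  panel_temperature,
  building_load_kw,
  kwh_output
FROM `energy.building_telemetry`
WHERE kwh_output IS NOT NULL
"""

# Training runs entirely inside BigQuery; result() blocks until it finishes.
client.query(train_model_sql).result()

# Score new rows with ML.PREDICT, again as plain SQL.
predictions = client.query("""
SELECT *
FROM ML.PREDICT(
  MODEL `energy.solar_capacity_model`,
  (SELECT hour_of_day, cloud_cover, panel_temperature, building_load_kw
   FROM `energy.building_telemetry`
   WHERE kwh_output IS NULL)
)
""").result()

for row in predictions:
    print(dict(row.items()))
```

A baseline like this is often enough to start the change-management conversation; a fancier model can come later once the stakeholders trust the pipeline.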
Additionally, there is a lot going on with edge ML right now. You can see some developing ideas here:
https://github.com/noahgift/managed_ml_systems_and_iot
Using "off the shelf" chips that talk to high level frameworks are a great way to quickly get results. In industries like energy, which have a huge legacy technology base, getting results quickly and making the IT part, "the easy part" is a way of limiting the risk of a project. The opposite approach, and one I don't recommend, is to try to be on the leading edge of cutting edge technology in say, Deep Learning. Limiting the complexity of technology in my opinion is the secret to getting goals accomplished, and this is what is pragmatic.