Columnar database engines are powerful enough to answer ad-hoc questions directly, so you often don't need to materialize summary data elsewhere or use BI tools such as Tableau that pull the data into their own server and have you run queries on their platform.
ELT solutions such as Airflow and dbt let you materialize data in your database with (incremental) materialized views, similar to how OLAP cubes work, but inside your database and using only SQL. That way you avoid vendor lock-in (looking at you, Tableau and Looker) and can manage the ELT workflow easily with these open-source tools.
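To make the idea concrete, here is a minimal sketch of what such a dbt incremental model looks like; the table and column names (`events`, `event_date`, `user_id`) are hypothetical, and the Jinja macros (`config`, `is_incremental`, `this`) are dbt's standard ones:

```sql
-- models/daily_events.sql (hypothetical schema, dbt incremental model)
{{ config(materialized='incremental', unique_key='event_date') }}

select
    event_date,
    count(*) as event_count,
    count(distinct user_id) as unique_users
from {{ source('app', 'events') }}
{% if is_incremental() %}
  -- on incremental runs, only reprocess days newer than
  -- what's already in the target table
  where event_date >= (select max(event_date) from {{ this }})
{% endif %}
group by event_date
```

Each `dbt run` then appends or refreshes only the recent partitions instead of rebuilding the whole summary table.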
These tools target analysts and data engineers, though, not business users. Your data team still needs to model the data, manage the ELT workflow, and adopt a BI tool for you. When you want a new measure in a summary table, you have to ask an analyst in your company to change the data model. As someone working in this industry, I can say we still have a way to go, but BI workflows will become much more efficient in the next few years thanks to columnar databases.
Shameless plug: we're also working on a data platform where you model your data (dimensions, measures, relations, etc.) and build ad-hoc analytics interfaces for business users. If a business user wants to optimize a specific set of queries (OLAP cubes), they simply select the dimension/measure pairs and the system automatically generates a dbt model that creates a summary table in your database, similar to an OLAP cube, using the GROUPING SETS feature in ANSI SQL. Here are some of the public models if you're interested: https://github.com/rakam-io/recipes
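For anyone unfamiliar with GROUPING SETS: one query can pre-aggregate several dimension combinations at once, which is essentially what an OLAP cube stores. A sketch with a hypothetical `orders` table:

```sql
-- One pass over the data produces rows at four grains
-- (hypothetical schema: orders(country, product_category, revenue))
create table sales_summary as
select
    country,
    product_category,
    sum(revenue) as total_revenue,
    count(*)     as order_count,
    -- grouping() = 1 marks rows that are rollup totals for that dimension
    grouping(country)          as country_is_total,
    grouping(product_category) as category_is_total
from orders
group by grouping sets (
    (country, product_category),  -- finest grain
    (country),                    -- per-country totals
    (product_category),           -- per-category totals
    ()                            -- grand total
);
```

A query layer can then serve any of those dimension combinations from the one summary table instead of rescanning the raw fact table.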