AU Class

Optimizing Project Robustness Assessment: A Case Study by Burns & McDonnell


Description

Managing user-access control and evaluating project robustness within the Autodesk Construction Cloud hub presents formidable challenges. This case study dives into the innovative strategies devised by Burns & McDonnell, in collaboration with Autodesk Consulting, to confront these obstacles head on. We explore Burns & McDonnell's business requirements and key performance indicators (KPIs), assessing the capabilities of the Autodesk Data Connector and Insight module. The study showcases the development of project robustness assessments using data sets from Data Connector, and dives into the technical intricacies of the ELT process and data pipeline. Emphasis is placed on data engineering processes to curate consolidated tables tailored to customer KPIs. Resulting products include a user-access control tool and robustness scoring systems, transforming raw data into actionable insights. These systems offer detailed intelligence for strategic decision making at Burns & McDonnell.

Key Learnings

  • Learn about streamlining user access control and project robustness assessment processes, reducing time by 70% and 60%, respectively.
  • Learn how to relate users at account and project levels, ensuring data accuracy, and how to score projects based on user data hygiene.
  • Discover projects underusing Autodesk Construction Cloud services, and learn how to customize KPIs for robustness scoring and assess projects' vitality performance.

Speakers

  • Avatar for Stephen Brooke
    Stephen Brooke
    Stephen Brooke is a seasoned Digital Delivery Project Manager at Burns & McDonnell, with a wealth of experience in streamlining project execution through cutting-edge technology. With 17 years of expertise in digital model management, he specializes in integrating design-build and EPC (Engineering, Procurement, and Construction) teams across the lifecycle of complex projects, including aerospace, life sciences, commercial, and consumer product facilities. In his role, Stephen leads the implementation and management of the BIM Execution Plan (BIMxP) program, ensuring project teams are equipped to manage workflows efficiently. He champions the use of virtual and augmented reality (VR/AR) technology to enhance the design-build process, reduce requests for information (RFIs), and improve constructability, while focusing on long-term maintenance needs for owners. A recognized thought leader and experienced speaker at Autodesk University (AU), Stephen has delivered presentations on evolving client data requirements, cloud-based solutions, and the future of project delivery through data and BIM maturity ideologies. His sessions emphasize navigating digital transformation and driving innovation, with a focus on data-driven strategies for sustainable and successful project outcomes. Stephen also excels in improving interoperability between platforms like Revit, Civil 3D, and Plant 3D, sharing his field experience to enhance drawing accuracy and constructability for field teams. He actively trains field superintendents, project managers, and subcontractors to use Autodesk Construction Cloud (ACC) for better collaboration, project tracking, and progress monitoring throughout the construction phase. With a passion for integrating technology with project delivery, Stephen continues to inspire teams to adopt innovative tools that improve efficiency, reliability, and overall project performance.
  • Avatar for Liang Gong
    Liang Gong
    Liang Gong is a structural engineer by training (PE) with a background in preconstruction/estimating, construction management, BIM/VDC, and data science. He helps customers leverage the data they produce through the design and build process to generate actionable insights, including forecasting and scalability. He also automates customized workflows with ACC Connect and Autodesk Platform Services. A graduate of Duke University, Liang is currently working on his second master's degree in Applied Data Science at the University of Chicago, focusing on AI/ML as a part-time student.
Transcript

STEPHEN BROOKE: Awesome. Welcome, everybody, to the presentation, class number CS6394, "Optimizing Project Robustness Assessment," a case study by Burns & McDonnell. Liang, can you go to the next slide?

I am Stephen Brooke, and I've got a co-presenter here, Liang Gong, with Autodesk Business Consulting. So we're going to give some speaker introductions first. I'm Stephen Brooke, Digital Delivery Manager at Burns & McDonnell.

A little bit about myself-- I got a Bachelor of Science in Computer-Aided Drafting and Design at the University of Central Missouri, currently work at Burns & McDonnell. Been here almost 18 years now. Some of the titles that I've held in my career here are structural designer, BIM/CAD manager, assistant project manager, VDC manager, and currently, digital delivery manager for our global facilities and life science and technology group.

Some key achievements about myself-- I was a recipient of the 2023 40 Under 40: Champions of Construction award. I was on the Digital Builder Podcast episode 85, "The Future of Commercial Space Exploration," and on the ENR Podcast, "The Reality of Managing Large Data Sets, Point Clouds, and Best Practices." I specialize in integrating BIM with cloud platforms like Autodesk Construction Cloud. And a motto I use around the office is driving innovation through technology and collaboration.

I'm going to pass it off to my co-presenter, Liang Gong.

LIANG GONG: Thanks, Stephen. I have a background in civil engineering, and I'm a licensed PE. I used to work for a design-build company, specializing in BIM, VDC, and estimating. Then I pivoted to the tech sector. Right now, I work as a consultant at Autodesk, focusing on analytics, data science, and automation. Meanwhile, I'm a part-time student at UChicago, specializing in AI and machine learning. I'll hand it back to Stephen to talk about the agenda today.

STEPHEN BROOKE: Awesome. Thanks, Liang. All right. So for our agenda today, we're going to talk about the background of the problem that Burns & McDonnell had, the current conditions that-- the current state at Burns & McDonnell in relation to optimizing projects, and then some targets, goals, and KPIs that we wanted to achieve, and the implementation strategy of the solution that Liang has worked with us on and put into place for us. And then Liang is going to-- I'm going to pass it off back to him, and he's going to talk about project users, service robustness, and vitality scoring, and then close it out for us on recommendations and talk about predictive analytics.

So why does the AEC industry need a data strategy to drive improvement? Well, ultimately, good decisions are made from good data, and good data comes from a data strategy. And nowadays, with clients increasingly focused on improving their models with data and modeling, they really want to put the I in BIM into their models. And to do that, and to get good value out of project performance and even AI, you have to have good data.

So let's talk about some of the industry trends. Industry trends right now are kind of showing that 95.5% of all data goes unused in engineering and construction. 58% of owners have said they've used or planned to use design-build, moving away from traditional design-bid-build ideologies and methodologies. 81% of owners and operators have a desire to drive better decisions from building data. And then companies that utilize big data analytics, whether an owner, client, or even in the AEC space, have shown they've had an increase in revenue by 8%.

So let's talk about the background to the problem and some of the challenges in data management and governance in the Autodesk Construction Cloud that we had started facing. So first, let's understand the ACC ecosystem. So ACC, Autodesk Construction Cloud, as we all know it, is not just a database. It's a comprehensive suite of cloud-based tools that enable us to manage construction, document management, project tracking, collaboration and data analytics. ACC also interacts with databases to store and manage vast amounts of project data.

So why did we need a data dashboard? Well, we wanted to ensure data integrity and compliance. To do that, we needed a governance framework, which is essential for data integrity and enables us to ensure that data is reliable, secure, and used appropriately. Data inconsistency leads to fragmented decision making, which can cause misinformed decisions and project delays due to disrupted information workflows.

Predictive analytics-- to get predictive analytics, you have to have accurate data, which can be a foundation for predictive models that anticipate risks and optimize resources. Regulatory compliance depends on precise data. Noncompliance due to inaccurate data can lead to fines, project shutdowns, or even data miscommunications amongst your team members. Data transparency fosters trust. Transparent data practices build stakeholder trust by providing clear and actionable insights.

So now let's talk about the current conditions and the current state at Burns & McDonnell. The current landscape is that we have challenges and opportunities in our current digital environment. Data governance is not being utilized. We are exposing multiple users to ACC and its capabilities, so we have to maintain a proper pace-- change is hard-- to mitigate that change, improve reliability, reduce tech fatigue, and promote more forward thinking.

Some issues that we had identified with our projects is that, especially within the ACC environment once we were introduced to it, was that we had too many project admins. We had a lack of utilizing the roles to optimize permissions and access to proper folders and content. And then we had inconsistent company listing.

In other words, we had many users going in and creating Burns & McDonnell entries-- I think we had up to 30 different versions of Burns & McDonnell. And if a user is not familiar with which one to use, they just go and pick one. We were finding projects that had multiple Burns & McDonnell companies assigned to users, while permissions were tied to only one of those Burns & McDonnell entries. So some users did not have access to the data and content they needed.

So then we also looked at utilizing ACC and tracking progress to ensure that data integrity and project success would occur. One thing we started trying to do was introduce the idea that one person can spark widespread utilization and change via willing adopters. There are folks that we work with, from project management to engineers to designers, modelers, and detailers, who want to do better by improving quality, gaining more efficiency, a number of things. We call those tech adopters-- folks that want to make a change or are willing to make a change. And so we tried to focus on them, leverage them, and show them what the potential possibilities were.

And what we started doing was working on our data connector usage, making sure that we're extracting our full hub database on a regular daily basis, and then scheduling refreshes to follow suit behind that extract to be updated within Power BI. And we had a desire for tracking. And we wanted to track certain items such as team performance.

Better data-- is the data being stewarded? Is it consistent? Is it being validated? We also wanted to train on better data optimization and on whether we were inadequately using roles and permissions. And then we wanted to evaluate training effectiveness and confirm the workflows, using the data from this dashboard to help us know where our gaps in education were.

So let's talk about some targets, goals, and KPIs. Some of our targets were to improve administrative robustness and achieve a higher vitality scoring result to elevate project admin effectiveness. We wanted to track ACC service utilization so we could expand the use of critical tools. So we wanted to know which tools are being used on the projects, to make sure that people are willing and able and starting to investigate the tools on their own.

We wanted to find projects that had a low robustness and vitality score so that way, we could correct and help mitigate and manage underperforming projects. And then we wanted to have role optimization to streamline assignments and permissions to make sure, again, the projects were set up correctly and people have the correct permissions.

We wanted to focus on data accuracy. We wanted to ensure that data was complete and accurate and showed all of the necessary project data that we as a company want to see. And then project robustness-- we wanted to show the overall strength of the project by making sure that everything was filled out and operational in the parameters of ACC, and that we were utilizing it to its full potential.

User training engagement was another target we wanted to focus on by increasing training participation. If users weren't using lesser-known tools, we would find out via this dashboard and ask, hey, why are you not using it? Oh, we don't know how to use it. And that would start a dialogue on proper training. And then we wanted to boost efficiency and automation through automated workflows. Again, we wanted to glean from our project data how our projects are running and how they are optimized, to create better workflows and gain even more efficiency.

So let's talk about some of the targets that we had set for ourselves in improving admin user robustness and vitality scoring. We wanted to increase that by 10% over the next quarter. We wanted to enhance utilization of ACC services and increase usage of key modules by 20% within the next six months. We wanted to reduce our low-scoring projects and decrease the number of those projects by 50% over the next quarter. We wanted to optimize our role assignments and permissions and reduce excessive project admins by 30% on projects within the next two quarters.

And then the final four targets that we have, we want to increase data completeness and accuracy. So we want to achieve a 95% data completeness by the end of the fiscal year. And when I say "data completeness," I mean that when we're going into each of these tools, we're filling out all the blanks appropriately, so that way, we're not leaving any field behind. And then we wanted to boost project robustness score by 15% within the next six months.

And then we want to improve user engagement with training programs. So we want 90% of our users to complete the training modules we've established as fundamentals, so to speak, within the next three months. And then the final target is to implement automated workflows on 50% of projects within a year to bring value to our projects.

So let's talk about how we currently implement Power BI at Burns & McDonnell. So our current state is we have a decentralized management profile. We have no central team or department for Power BI workspace report management. We have ongoing efforts to implement governance for data-related activities, including Power BI.

For administrative oversight, we have one or two administrators that oversee general updates and daily monitoring. For user capabilities and workspace management, users can create and manage workspaces within the Power BI tenant. They have full admin rights within their own workspaces. For content creation and sharing, our users can create, publish, and share Power BI apps, dashboards, reports, data sets, and data flows.

Sharing of these Power BI apps and dashboards is both internal and external. And if we were to share it with external users, it's via an Azure account. And for access control, some content is shared company wide, while other content is restricted to specific audiences. Could be based on clients' requirements or security profiles.

So now, I'm going to pass it over to Liang to talk about what he had implemented for us with the dashboard to help monitor and provide health assessments over our ACC and project environments. Liang?

LIANG GONG: Yeah. Thank you so much, Stephen, for the comprehensive illustration of the business requirements. So in order to match Burns & McDonnell's business requirements and blueprint, these are the actual implementation steps-- how we actually digest those business requirements into reality.

So first of all, it's about the ACC data ecosystem overview. If you are familiar with the evolution of analytics, these are the steps: starting with descriptive, then predictive, then prescriptive, and then cognitive, step by step. Right now, for this initiative, we really focus on descriptive-- and within descriptive, on the diagnostic, because we're trying to diagnose whether each project is in a healthy condition, its robustness and vitality. So what we're doing in this presentation falls under descriptive and diagnostic. But I will touch on some predictive analytics at the end of this presentation.

And if you're familiar with the ACC environment or with a construction project, the different modules like issues, checklists, and forms each live in their own silo, and the data lives in those silos. However, our final goal is to diagnose the project health, analyze it, and visualize it. Getting from the left side to the right side requires a lot of data engineering and analysis work in the middle. So we also go over this in Power BI to stress how important data engineering is in the process.

You're probably already very familiar with this interface. I bet a lot of the audience here uses the Insight module, especially the Data Connector under Insight. A lot of people just click Run extraction. You can download the ZIP file, which includes the more than 200 CSVs shown on the right side. And I listed two links here if you're interested in digging deeper into the Data Connector.

However, for this initiative, we don't want to manually download the CSVs and sync or refresh them on a weekly basis. So what we're doing is leveraging the Microsoft-certified Autodesk Construction Cloud connector to pull the data automatically from the ACC cloud into Power BI and perform data engineering and visualization in Power BI directly, without manually running the extraction every week and validating the data, which is very time consuming.
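For teams that do start from the manual Data Connector extract, a minimal sketch of loading a few of the extracted CSVs with pandas might look like the following. The file names are illustrative (the real extract contains 200+ CSVs whose exact names depend on the schema version), and the workflow described in this class uses the certified Power BI connector instead.

```python
# Minimal sketch: loading a manually downloaded Data Connector extract.
# File names are illustrative, not the actual extract schema.
from pathlib import Path
import pandas as pd

extract_dir = Path("acc_data_connector_extract")  # unzipped extraction folder

# Read only the tables we care about instead of all 200+.
tables = {}
for name in ["issues", "issues_issue_types", "admin_project_users"]:
    csv_path = extract_dir / f"{name}.csv"
    if csv_path.exists():
        tables[name] = pd.read_csv(csv_path)

for name, df in tables.items():
    print(name, df.shape)
```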

Before we talk about the robustness and vitality scoring system, I want to show you intuitively which data we're trying to pull from the product UI. This is a very classic ACC Issues UI. For each issue listed here, we want to put all these values into a clean tabular table. For example, for this issue, we want to capture the attribute name and attribute value here, and which issue type and subtype it is.

How are we achieving that? We just talked about how there are over 200 tables in the Data Connector. We really have to consolidate these tables. They are structured, normalized tables, so we need to figure out a way to consolidate them into one single table that contains all the values listed here-- like Builder's FirstSource, the Punch List, and the issue subtype for punch list issues-- because, as you can see, these values are already distributed across these five normalized tables.

But these five normalized tables are not ready to be consumed by the end user, by the data analyst. A data engineer has to consolidate them. And I'm going to talk about how we are doing this.
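As a rough illustration of that consolidation step, here is a minimal pandas sketch that flattens a set of normalized issue tables into one wide table. The table and column names are hypothetical, not the actual Data Connector schema.

```python
# Hypothetical sketch of consolidating normalized issue tables into one wide
# table. Table and column names are illustrative, not the actual schema.
import pandas as pd

issues = pd.DataFrame({"issue_id": [1], "title": ["Leaking valve"],
                       "type_id": [10], "subtype_id": [100]})
issue_types = pd.DataFrame({"type_id": [10], "type_name": ["Punch List"]})
issue_subtypes = pd.DataFrame({"subtype_id": [100], "subtype_name": ["Electrical"]})
attr_defs = pd.DataFrame({"attr_id": [7], "attr_name": ["Root Cause"]})
attr_values = pd.DataFrame({"issue_id": [1], "attr_id": [7], "attr_value": ["Design"]})

# One row per issue/attribute combination, with type, subtype, and attribute
# name and value all denormalized onto the issue.
consolidated = (
    issues
    .merge(issue_types, on="type_id", how="left")
    .merge(issue_subtypes, on="subtype_id", how="left")
    .merge(attr_values, on="issue_id", how="left")
    .merge(attr_defs, on="attr_id", how="left")
)
print(consolidated)
```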

This brings us to our scoring system: robustness and vitality. "Robustness" means how clean the data are-- garbage in, garbage out. We want the data to be very clean, very robust. "Vitality" means how actively users are using the different modules in ACC. That's why we're using the word "vitality" here. So together, it's the robustness and vitality scoring system.

For the different scoring systems, we focus on users, services, and issues in this presentation. However, say some people only care about certain project types. We could extract those project types and set up different measurements, different metrics, for those specific project types. That's why I indicated Classification_1.

Also, once we have the robustness and vitality scoring for issues, we could extend similar logic to other modules like assets, admin, forms, checklists, et cetera. But for this presentation, we're going to focus on the users and services robustness and vitality scoring. As for how the scoring systems are associated with each other, they are connected through the project ID. That's why there is a many-to-one relationship between the users, services, and issues scoring systems through the project ID.

So, coming to the first one, the users robustness and vitality scoring: from a business requirement perspective, if you open ACC at the account level or the project level, you see a lot of empty spaces. For example, at the project level, when you add a user, there is no company associated and there is no role associated, which is really not ideal.

When you are adding a user at the project level, you really want that user to be associated with a company and a role. This makes for much easier management downstream. Also, remember that in Forms, when you are assigning access levels, they can sometimes be assigned by company or by role. So we really want these columns to be filled.

How are we notified whether those columns are filled or not? That comes back to business intelligence. We worked with Burns & McDonnell to come up with these four metrics, these four KPIs, at the user level for each project. For a given project, we want to know how many users are missing an associated project role, how many are missing an associated project company, how many users are project admins, and how many total users there are.

So for example, for users that don't have a project role associated, if there are fewer than four such users, we give it a perfect score, like a 10. The weight column is how important each KPI is. If there are between four and six users missing project roles, we assign a good score, et cetera. And we convert this Excel logic into DAX in Power BI-- a business intelligence conversion from Excel to Power BI. So now, whenever the data is refreshed in Power BI, the scores refresh directly.
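To make the banding and weighting concrete, here is an illustrative sketch in Python (rather than the DAX actually used) of the kind of weighted, banded KPI scoring described above. All thresholds, weights, and band scores are example values, not Burns & McDonnell's actual configuration.

```python
# Illustrative sketch of banded, weighted user-KPI scoring.
# Every number below is an example value, not the real configuration.

# Per-KPI bands: (upper_threshold, band_score); the first band the value fits
# in wins. In practice each KPI gets its own thresholds.
BANDS = {
    "users_missing_project_role":    [(3, 10), (6, 7), (10, 4)],
    "users_missing_project_company": [(3, 10), (6, 7), (10, 4)],
    "project_admin_count":           [(5, 10), (10, 7), (20, 4)],
}
WEIGHTS = {
    "users_missing_project_role": 8,
    "users_missing_project_company": 10,
    "project_admin_count": 6,
}

def band_score(value, bands, poor_score=1):
    """Return the band score for the first threshold the value falls under."""
    for upper, score in bands:
        if value <= upper:
            return score
    return poor_score

def project_user_score(kpi_values):
    """Weighted average of band scores across the user KPIs."""
    total_weight = sum(WEIGHTS.values())
    weighted = sum(band_score(kpi_values[k], BANDS[k]) * w for k, w in WEIGHTS.items())
    return round(weighted / total_weight, 2)

print(project_user_score({"users_missing_project_role": 5,
                          "users_missing_project_company": 2,
                          "project_admin_count": 12}))
```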

And these are the four metrics here, and this is the overall score for each project. For the projects that are scoring very low-- because of the limitation of the page, you can actually scroll to the right side-- you can see exactly which of the four KPIs a specific project is scoring low on. Then you can go to that specific project in the product UI to figure it out. Imagine doing this one by one in the product UI-- how time consuming that would be. Here, it's just a one-page report, very time efficient. It saves a lot of manual work.

I just want to touch a little bit on the technical part-- I don't want to go too deep. How we get the consolidated users table: it starts with the admin project user services table. Why did I choose to start with admin project user services? Because whenever you add a user to a project, at least the Docs module must be turned on.

So the admin project user services table contains all the users that are assigned to a specific project. That's why I started with this table. Gradually, it evolves into the consolidated admin project user services table, which is used for the scoring system. The logic is that we assign the weight to each KPI first-- say the first KPI, the number of users who don't have a project company associated, is very important to me. On a 1 to 10 scale, I assign it a weight of 10.

Then I define my KPI, the count of users with a blank project company. How do we do this? Through the consolidated table, by counting how many distinct users there are. After we define this, the next step is to define the scoring: what is perfect, what is good, what is deficient, what is poor? I define those bands based on the KPI's definition. And all these numbers are definitely customizable-- based on your needs, we can swap these numbers.
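As an illustration of defining such a KPI from a consolidated users table, here is a small pandas sketch that counts distinct users with a blank project company per project; the column names are hypothetical.

```python
# Hypothetical sketch of the "users with a blank project company" KPI,
# computed per project from a consolidated users table.
import pandas as pd

users = pd.DataFrame({
    "project_id":      ["P1", "P1", "P1", "P2"],
    "user_id":         ["u1", "u2", "u3", "u1"],
    "project_company": ["Burns & McDonnell", None, "", "Burns & McDonnell"],
    "project_role":    ["Engineer", "Designer", None, "PM"],
})

# Treat both nulls and empty strings as "blank".
blank_company = users["project_company"].isna() | (users["project_company"].str.strip() == "")

kpi = (users[blank_company]
       .groupby("project_id")["user_id"]
       .nunique()
       .rename("users_blank_project_company"))
print(kpi)  # P1 -> 2 distinct users with no company
```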

The next one is services. Previously, we were looking at robustness and vitality scoring from the users' perspective, but now we want to look at it from the services perspective. For example, on the ACC UI under the Assets module, there's nothing-- completely blank-- which means we're not utilizing the ACC Assets module at all.

As a QA/QC manager who manages all the projects under the hub, we want to know this information very quickly. Without going into each project specifically, how can I access this information in an efficient way? This is how. I only gave you an example for assets in the previous slide, but here, Burns & McDonnell has mapped out all the different modules they're interested in-- docs, issues, views, assets, transmittals, submittals, et cetera-- along with their weight, how important they are, and what exactly they care about on the [? newer ?] dashboard. This is the initial exploration phase. Then we turn those into business intelligence.

Similarly to the users robustness and vitality, we mapped out the robustness and vitality scoring for the different services, as you can see here-- for example, the second one, [INAUDIBLE] issues with root cause. Ideally, we really want each issue to have a root cause assigned. We do not want the root cause field for an issue to be blank, because we want to analyze root causes across issues so that we can avoid repetitive work in the future. That's why this field is very important.

We assign an importance score from 1 to 10; we assigned a 7 to this KPI. How we define the perfect score, good score, and deficient score depends on how many issues do not have a root cause assigned. There are around 20 KPIs. By translating this Excel into business intelligence-- DAX in Power BI-- we came up with the final visualization for the different projects and their corresponding health score. Basically, the health score here represents robustness and vitality. Instead of saying both words, I just use the single word "health" score here.

As you can see here, for the projects scoring very low, we can keep scrolling to the right side to see which specific KPIs the project is scoring low on. Then I can go talk to the project manager and ask, why aren't you using the Assets module you're supposed to use? And why don't your issues have root causes assigned? So one page answers all those questions without the effort of going back to the product UI, going through each module, and figuring out whether each field has data assigned. This is very time efficient. It saves a lot of energy.

Again, to touch a little on the technical side: first, we define how important the KPI is by assigning an importance score. Then we define the KPI itself by writing the business intelligence DAX. For example, this one, Project Issues Customer Attribute Values Count, means how many issues in the project have customer attribute values filled in. You've probably come across the situation where you have a lot of customer attributes, but none of the issues have customer attribute values filled in. Then what's the point of setting up the customer attributes, right? You want the end users to fill in those customer attributes. You want to see values under those customer attributes. So that is what this KPI measures.

And we define what is a perfect score, what is a good score, and what is a bad score. For this specific KPI, as you can see here, as long as all of the issues have at least one customer attribute value filled in, I give it a perfect score. If more than 80% of the issues have at least one customer attribute value filled in, I give it a good score. And so on-- between 50% and 80% is a deficient score, and below that is a poor score.
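A minimal sketch of that percentage-banded scoring, again in Python rather than the DAX actually used; the band cut-offs follow the ones described above, while the band scores themselves are example values.

```python
# Illustrative sketch of percentage-banded scoring for the
# "issues with at least one customer attribute value" KPI.
def attribute_coverage_score(issues_total: int, issues_with_attr_value: int) -> int:
    if issues_total == 0:
        return 0  # no issues to evaluate; handle however your dashboard prefers
    coverage = issues_with_attr_value / issues_total
    if coverage >= 1.0:
        return 10  # perfect: every issue has at least one value filled in
    if coverage >= 0.8:
        return 7   # good
    if coverage >= 0.5:
        return 4   # deficient
    return 1       # poor

print(attribute_coverage_score(40, 33))  # 82.5% coverage -> good
```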

Just hypothetically, if we were doing all this work manually-- going through all the projects, checking each module-- think how time consuming that would be, especially if you're doing it weekly. So, converting that effort into time and money, dollar values, this is approximately how much it could potentially save the client.

These are some potential recommendations we are providing to customers. Always, always use templates whenever possible, because templates not only make management easier on the construction side, they also have a large impact downstream on the data side. And test with one project before you roll this out into production, because with one project you can see whether there are any duplicated rows or any steps or queries that are wrong. That gives you a chance to correct it before you incorporate all the projects, which contain large amounts of data.

There are also some tips about the differences between blank, null, and empty in the M language. Understand the relationships between the different modules in ACC, like how you can create an issue from checklists or forms. In turn, this dashboard really encourages and motivates the people on the construction side to input more accurate data-- garbage in, garbage out-- and it could potentially help downstream lifecycle management in the future.

Another big suggestion here: notice that what we're doing is pulling the data from the ACC cloud into Power BI directly by leveraging the Microsoft-certified data connector. However, when your hub grows too big-- imagine 3,000, 4,000, 5,000 projects-- the data volume becomes enormous, and it's hard to put all of it into Power BI, because Power BI is not a database at the end of the day.

That's why I suggest adding a semantic layer, which could be a SQL database-- Azure SQL, Snowflake, in the cloud or on premises, whatever works. Perform the data engineering work in that SQL database before you bring the consolidated table into Power BI for visualization and analytics. This is a big thing you want to think about before starting this initiative.
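As a hedged sketch of what pushing the consolidation into a warehouse could look like, the following Python snippet runs an illustrative consolidation query in Snowflake via its Python connector. The table names, column names, and credentials are placeholders, not the actual schema or environment used in this project.

```python
# Hypothetical sketch: pushing the table consolidation down to a warehouse
# (Snowflake here) instead of doing it in Power Query.
import snowflake.connector

CONSOLIDATION_SQL = """
CREATE OR REPLACE TABLE ISSUES_CONSOLIDATED AS
SELECT i.id,
       i.title,
       t.title  AS issue_type,
       st.title AS issue_subtype,
       a.title  AS attribute_name,
       av.value AS attribute_value
FROM   issues i
LEFT JOIN issue_types          t  ON i.type_id      = t.id
LEFT JOIN issue_subtypes       st ON i.subtype_id   = st.id
LEFT JOIN issue_attribute_vals av ON av.issue_id    = i.id
LEFT JOIN issue_attribute_defs a  ON av.attr_def_id = a.id;
"""

conn = snowflake.connector.connect(
    account="my_account",      # placeholder credentials
    user="my_user",
    password="my_password",
    warehouse="ANALYTICS_WH",
    database="ACC_DATA",
    schema="PUBLIC",
)
try:
    conn.cursor().execute(CONSOLIDATION_SQL)  # heavy joins run in the warehouse
finally:
    conn.close()
# Power BI then imports only ISSUES_CONSOLIDATED for visualization.
```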

And this is what I tried in Snowflake as a comparison. The steps here, the [AUDIO OUT], are equivalent to the SQL queries in Snowflake. For the same steps, Snowflake takes 30 seconds to process, but in Power BI, each step could take 30 minutes. You can see how huge the difference in magnitude is. That's why I suggest you use a professional database to perform the data engineering before performing any analytics.

So previously, we talked about the evolution of analytics, from descriptive analytics to predictive analytics, right? I promised I would talk a little bit about predictive analytics, and now is the time. The first one is about predicting an issue's priority level. This is a supervised machine learning algorithm, which is shown here.

As you can see here, based on these eight parameters, we're predicting how important, how urgent, a specific issue is on the construction site. So instead of subjectively deciding which issue to solve first on the construction site, this algorithm-- [? circular-- ?] I think this is probably a decision tree-- tells us which issue is more urgent and which issue should be solved first.
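A minimal sketch of this kind of supervised model, using scikit-learn's decision tree on made-up features and labels; the actual eight model inputs are not published in the class materials.

```python
# Hypothetical sketch of a supervised model that predicts issue priority.
# Feature names and training data are invented for illustration.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

issues = pd.DataFrame({
    "days_open":       [2, 30, 5, 60, 1, 45, 10, 90],
    "assigned_to_sub": [0, 1, 0, 1, 0, 1, 1, 1],
    "has_root_cause":  [1, 0, 1, 0, 1, 0, 1, 0],
    "linked_to_rfi":   [0, 1, 0, 1, 0, 0, 1, 1],
    "priority":        ["low", "high", "low", "high", "low", "high", "low", "high"],
})

X = issues.drop(columns="priority")
y = issues["priority"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

model = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_train, y_train)
print("holdout accuracy:", model.score(X_test, y_test))
print("predicted priority:", list(model.predict(X_test)))
```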

The second one is unsupervised machine learning, shown here, for Revit models. You've probably had this situation before: you have a big Revit model, 1 gigabyte, and if it doesn't have a good naming convention, it takes minutes to open, and you have to explore around to decide its discipline.

However, with this unsupervised machine learning algorithm, the clustering, you can see here that all the architectural models are clustered together, electrical together, mechanical together, structural together. For a new model, even though you don't know which discipline it is, based on its metadata-- the parameters of the Revit model-- it clusters closer to the structural models. So I can label it as a structural Revit model even before I open it, which saves a lot of time. This is an example of unsupervised machine learning.
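A minimal sketch of that clustering idea, using scikit-learn's KMeans on hypothetical model metadata; the features and values are illustrative, not the actual parameters used.

```python
# Hypothetical sketch: clustering Revit models by simple metadata so a new
# model can be assigned a likely discipline before it is opened.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Example metadata: [file_size_mb, sheet_count, family_count, warning_count]
known_models = np.array([
    [850, 120, 900, 40],   # architectural
    [900, 110, 950, 35],   # architectural
    [300,  40, 300, 10],   # electrical
    [280,  35, 320, 12],   # electrical
    [600,  60, 500, 80],   # structural
    [620,  55, 480, 75],   # structural
])

scaler = StandardScaler().fit(known_models)
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(scaler.transform(known_models))

new_model = np.array([[610, 58, 490, 70]])  # metadata of an unopened model
cluster = kmeans.predict(scaler.transform(new_model))[0]
print("new model falls in cluster", cluster)  # same cluster as the structural models
```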

The next one is all about words and text, so this is associated with natural language processing. You probably know that you put a lot of descriptions under an issue, or when you have a submittal, you write long paragraphs to the architect or the engineers describing your questions and looking for confirmation. All of those words, paragraphs, and texts are valuable assets.

Beyond the surface values, we can perform sentiment analysis on those texts to see which submittal is subjective, whether it's very aggressive, whether it involves legal issues. Without reading through them and looking for keywords, these algorithms, leveraging the natural language processing discipline, can answer those questions. So do not let your valuable assets just float away or treat them as waste. Don't do that.
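A minimal sketch of off-the-shelf sentiment scoring with NLTK's VADER analyzer, on invented example texts; a production setup would likely use a domain-tuned model instead.

```python
# Minimal sketch: scoring submittal/issue text with an off-the-shelf
# sentiment analyzer (NLTK's VADER). The texts are invented examples.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-time lexicon download
analyzer = SentimentIntensityAnalyzer()

texts = [
    "Please confirm the anchor bolt layout at your earliest convenience.",
    "This is the third time we have flagged this error and nothing has been fixed.",
]
for text in texts:
    scores = analyzer.polarity_scores(text)  # neg / neu / pos / compound
    print(round(scores["compound"], 3), text)
```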

The next one is about ACC photos. This involves artificial neural networks-- this is deep learning, computer vision. For example, for this construction worker, by using SAM-- the Segment Anything Model-- I can tell it has a safety hat, a vest, and boots. Imagine you have a hundred pictures and you're trying to figure out who's not wearing construction boots. You'd have to go through the pictures and each person in the picture, just with the naked eye, to see whether they are wearing construction boots, which is very time consuming. But by leveraging this algorithm, it automatically points out the person who's not wearing construction boots on site, and it also helps you abide by OSHA guidelines.
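A hedged sketch of generating candidate object masks for a site photo with the open-source segment_anything package; classifying each mask as a hat, vest, or boot would need an additional detector or classifier that is not shown here, and the checkpoint path and image file name are assumptions.

```python
# Hypothetical sketch: generating segmentation masks for a site photo with
# the Segment Anything model. PPE classification of each mask is not shown.
import numpy as np
from PIL import Image
from segment_anything import sam_model_registry, SamAutomaticMaskGenerator

# Assumed local checkpoint, downloaded separately from the SAM release.
sam = sam_model_registry["vit_b"](checkpoint="sam_vit_b_01ec64.pth")
mask_generator = SamAutomaticMaskGenerator(sam)

image = np.array(Image.open("site_photo_monday.jpg").convert("RGB"))
masks = mask_generator.generate(image)  # list of dicts with 'segmentation', 'area', 'bbox', ...

print(f"Found {len(masks)} candidate regions")
largest = max(masks, key=lambda m: m["area"])  # e.g. inspect the biggest region first
```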

Another example of computer vision: a lot of people have construction cameras on site at the same angle, taking a picture every week on Monday. By looking at consecutive weeks, I can tell, for example, that week three and week four had the biggest discrepancy compared to other consecutive weeks, which means they had the biggest construction progress. Again, without manually assessing construction progress, just by looking at the pictures, this algorithm tells you how much construction progress we're making on a weekly basis, which also has an impact on the project schedule.
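A minimal sketch of that week-over-week comparison via simple pixel differencing of fixed-camera photos; the file names are illustrative, and a real pipeline would likely use a more robust change-detection model.

```python
# Minimal sketch: comparing fixed-camera photos from consecutive weeks by
# pixel differencing; the pair with the largest change is a rough proxy for
# the biggest construction progress. Assumes all photos share one resolution.
import numpy as np
from PIL import Image

def load_gray(path: str) -> np.ndarray:
    return np.asarray(Image.open(path).convert("L"), dtype=np.float32)

weeks = ["week1.jpg", "week2.jpg", "week3.jpg", "week4.jpg"]
frames = [load_gray(p) for p in weeks]

changes = []
for i in range(len(frames) - 1):
    diff = float(np.mean(np.abs(frames[i + 1] - frames[i])))  # mean absolute pixel change
    changes.append((weeks[i], weeks[i + 1], diff))

biggest = max(changes, key=lambda c: c[2])
print("largest week-over-week change:", biggest)
```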

Last but not least is time series-- the pictures here. Some clients are very interested in knowing how many more tokens they're going to consume in the next 12 months. So instead of a ballpark, subjective estimate, by leveraging the past seven years of data, I can produce the next 12-month forecast by running an ARIMA or SARIMA model, which is indicated here. This is all time series principles. As long as your tabular data has a timestamp associated with each row, we can do this for other questions too-- how about labor requirements? How much labor has been consumed in the past year on a weekly basis? We could forecast the next month's labor consumption.
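A minimal sketch of such a forecast with statsmodels' seasonal ARIMA; the monthly series here is synthetic, standing in for the actual consumption history.

```python
# Minimal sketch: a 12-month forecast with a seasonal ARIMA (SARIMA) model.
# The series is synthetic; in practice, feed your own historical usage.
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(0)
months = pd.date_range("2018-01-01", periods=84, freq="MS")  # ~7 years of monthly data
usage = pd.Series(1000 + np.arange(84) * 15 + rng.normal(0, 50, 84), index=months)

model = SARIMAX(usage, order=(1, 1, 1), seasonal_order=(1, 0, 1, 12))
fitted = model.fit(disp=False)
forecast = fitted.forecast(steps=12)  # next 12 months
print(forecast.round(0))
```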

These are just some examples of supervised and unsupervised learning, [AUDIO OUT] processing, computer vision, and time series analytics in the machine learning world. And there are a lot more. At the end of the day, we really want to be proactive toward risks and lower insurance premiums.

Thanks a lot. That really wraps up the presentation today. If you have any questions for me or Stephen, please feel free to reach out to us on LinkedIn. We're really happy to answer your questions. And thanks again to my speaking partner, Stephen-- thanks for his time, thanks for everything.

STEPHEN BROOKE: Awesome. Thanks, Liang. Yes, if you guys have any questions, please reach out to us. Again, the whole idea of this is to provide more ideas and input into how we can all do better in this industry and for our teams and our projects, our clients in the industry as a whole. So thanks again, and enjoy the rest of AU.

LIANG GONG: Awesome. Thanks, Stephen. Thanks, everyone.

Bing
We use Bing to deploy digital advertising on sites supported by Bing. Ads are based on both Bing data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that Bing has collected from you. We use the data that we provide to Bing to better customize your digital advertising experience and present you with more relevant ads. Bing Privacy Policy
G2Crowd
We use G2Crowd to deploy digital advertising on sites supported by G2Crowd. Ads are based on both G2Crowd data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that G2Crowd has collected from you. We use the data that we provide to G2Crowd to better customize your digital advertising experience and present you with more relevant ads. G2Crowd Privacy Policy
NMPI Display
We use NMPI Display to deploy digital advertising on sites supported by NMPI Display. Ads are based on both NMPI Display data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that NMPI Display has collected from you. We use the data that we provide to NMPI Display to better customize your digital advertising experience and present you with more relevant ads. NMPI Display Privacy Policy
VK
We use VK to deploy digital advertising on sites supported by VK. Ads are based on both VK data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that VK has collected from you. We use the data that we provide to VK to better customize your digital advertising experience and present you with more relevant ads. VK Privacy Policy
Adobe Target
We use Adobe Target to test new features on our sites and customize your experience of these features. To do this, we collect behavioral data while you’re on our sites. This data may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, your IP address or device ID, your Autodesk ID, and others. You may experience a different version of our sites based on feature testing, or view personalized content based on your visitor attributes. Adobe Target Privacy Policy
Google Analytics (Advertising)
We use Google Analytics (Advertising) to deploy digital advertising on sites supported by Google Analytics (Advertising). Ads are based on both Google Analytics (Advertising) data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that Google Analytics (Advertising) has collected from you. We use the data that we provide to Google Analytics (Advertising) to better customize your digital advertising experience and present you with more relevant ads. Google Analytics (Advertising) Privacy Policy
Trendkite
We use Trendkite to deploy digital advertising on sites supported by Trendkite. Ads are based on both Trendkite data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that Trendkite has collected from you. We use the data that we provide to Trendkite to better customize your digital advertising experience and present you with more relevant ads. Trendkite Privacy Policy
Hotjar
We use Hotjar to deploy digital advertising on sites supported by Hotjar. Ads are based on both Hotjar data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that Hotjar has collected from you. We use the data that we provide to Hotjar to better customize your digital advertising experience and present you with more relevant ads. Hotjar Privacy Policy
6 Sense
We use 6 Sense to deploy digital advertising on sites supported by 6 Sense. Ads are based on both 6 Sense data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that 6 Sense has collected from you. We use the data that we provide to 6 Sense to better customize your digital advertising experience and present you with more relevant ads. 6 Sense Privacy Policy
Terminus
We use Terminus to deploy digital advertising on sites supported by Terminus. Ads are based on both Terminus data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that Terminus has collected from you. We use the data that we provide to Terminus to better customize your digital advertising experience and present you with more relevant ads. Terminus Privacy Policy
StackAdapt
We use StackAdapt to deploy digital advertising on sites supported by StackAdapt. Ads are based on both StackAdapt data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that StackAdapt has collected from you. We use the data that we provide to StackAdapt to better customize your digital advertising experience and present you with more relevant ads. StackAdapt Privacy Policy
The Trade Desk
We use The Trade Desk to deploy digital advertising on sites supported by The Trade Desk. Ads are based on both The Trade Desk data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that The Trade Desk has collected from you. We use the data that we provide to The Trade Desk to better customize your digital advertising experience and present you with more relevant ads. The Trade Desk Privacy Policy
RollWorks
We use RollWorks to deploy digital advertising on sites supported by RollWorks. Ads are based on both RollWorks data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that RollWorks has collected from you. We use the data that we provide to RollWorks to better customize your digital advertising experience and present you with more relevant ads. RollWorks Privacy Policy

Are you sure you want a less customized experience?

We can access your data only if you select "yes" for the categories on the previous screen. This lets us tailor our marketing so that it's more relevant for you. You can change your settings at any time by visiting our privacy statement

Your experience. Your choice.

We care about your privacy. The data we collect helps us understand how you use our products, what information you might be interested in, and what we can improve to make your engagement with Autodesk more rewarding.

May we collect and use your data to tailor your experience?

Explore the benefits of a customized experience by managing your privacy settings for this site or visit our Privacy Statement to learn more about your options.