AU Class

Forge and Amazon Web Services: A Perfect Match


Description

In this class, you’ll learn how Autodesk uses Amazon Web Services (AWS) to power the Forge platform, and how you can use the same services for your own Forge applications. You’ll get hands-on experience showing you how easy it is to get started with Forge using our quick-start guide. We’ll start with a quick overview of the Forge platform and introduction to AWS. After that, we’ll dive deep into the specific steps and configurations used for both a Node.js application and a .NET application. We’ll discuss the architectures, microservices, development environments, and application deployment options, and help you identify the best practices for running a production Forge application on AWS. This class is best for attendees with some software development experience, and everyone must bring their own laptops. At the end of the class, you’ll have built and deployed a demo application to AWS using Forge.

Key Learnings

  • Learn how to build and deploy software using Forge
  • Learn how to easily create your own production-ready cloud environments for Forge applications
  • Learn about the best practices for running and securing applications in the cloud
  • Discover how the cloud helps accelerate change and time to market

Speakers

  • Thomas Jones
    Tom “Elvis” Jones is a Solutions Architect with Amazon Web Services who spends his time focusing on the complex challenges of our most strategic partners in the Design, Engineering, and Manufacturing space. His career has spanned both the hardware and software sides of the house, including work at Red Hat, Transmeta, and Pratt & Whitney, giving Tom extremely broad technical experience across multiple industries and verticals. He is a whitepaper author, a patent holder, a training material builder, a DevOps expert, an active Maker, a mountain biker, and above all, a passionate technologist. He has been known to go far out of his way for pinball and fondly recalls playing "Adventure" on an ADDS Viewpoint ASCII terminal.
  • Vinod Shukla
    Vinod Shukla is a Partner Solutions Architect with Amazon Web Services. He has over a decade of experience designing and building high-performance, enterprise-grade software systems. As part of the AWS Quick Starts team, he enjoys working with partners to provide technical guidance and assistance in building gold-standard reference deployments that are fully automated, highly available, and secure. He is also an active contributor in the open-source community. Prior to joining Amazon Web Services, Vinod worked as a senior software engineer for Atypon Systems, where he developed and maintained the RightSuite product line. RightSuite is an enterprise access-control and e-commerce solution used by many of the world's largest publishing and media companies.
  • Jaime Rosales Duque
    Jaime Rosales is a dynamic, accomplished Sr. Developer Advocate with Autodesk, highly regarded for 8+ years of progressive experience in software development, customer engagement, and relationship building for industry leaders. He's part of the team that helps partners and customers create new products and transition to the cloud using Autodesk's new platform, Forge. He joined Autodesk in 2011 through the acquisition of Horizontal Systems, the company that developed the cloud-based collaboration system now known as BIM 360 Glue (the Glue). He was responsible for developing all the add-ins for BIM 360 Glue, using the APIs of various AEC desktop products. He is currently empowering customers around the world to use Autodesk's Forge platform through hosted events such as Forge Accelerators, AEC Hackathons, and VR & AR Hackathons. He has recently been involved in the development of an AWS Quick Start to support Forge applications.
Transcript

TOM JONES: So everybody should have a little slip of paper with a little hash on it. If you don't, raise your hand, and one of our assistants will come bring you one. But you're going to need that to get into the lab.

AUDIENCE: [INAUDIBLE].

TOM JONES: We got one guy here. Everybody ready? All right, let's do this. So welcome to Forge and Amazon Web Services-- A Perfect Match. My name is Tom Jones. My nickname is Elvis, and both seem to be perfect for being on stage at Las Vegas. I'm a solution architect at Amazon Web Services. Joining me here today is Jaime.

JAIME ROSALES: Hi, my name is Jaime Rosales. I'm a senior developer advocate for the Autodesk Forge platform. And we also have Vinod with us today.

VINOD SHUKLA: Hi, everyone. My name is Vinod Shukla. I'm also a partner solutions architect at AWS. I've been working with Jaime and Elvis, developing some Quick Starts, and excited to be here to show that to you today.

TOM JONES: Awesome. Super. Thanks guys.

So we got some learning objectives for today. We're going to learn how to build and deploy software using Autodesk Forge. We're going to learn how to create your own production-ready cloud environments for those Forge applications. We're going to learn about best practices for running and securing those applications, and we're going to take a look at how the cloud can help you accelerate your development process.

So let's talk really quickly about the AWS Quick Start program, and I think that's-- this is you, Vinod. I'll let you talk about it.

VINOD SHUKLA: Sure, thank you. All right, so what is AWS's Quick Start program? So AWS Quick Starts are gold standard reference deployments of key partner technologies and solutions in the AWS cloud.

We give customers a push-button way of deploying complex workloads using AWS best practices for security and high availability, as well as the best practices for the product you are deploying. So you can think of Quick Starts as next-generation white papers. In addition to documentation in the form of an architecture diagram and a deployment guide, we also give you a fully automated deployment option using AWS CloudFormation, so that not only can you read and learn more about the technology, but you can actually have a working solution that you can go and deploy in your AWS accounts.

So just a little bit talking about the motivation for-- sorry. There you go.

So just talking a little bit about the motivation for why Quick Starts exist. If you're building infrastructure in the cloud, let's say you start with a basic building block: you are just starting off setting up your network, and you build your virtual private cloud-- a VPC, which is your isolated environment. And you are setting up a network layout-- you set up subnets, then you set up rules for routing, and so on. If you are doing this manually, these are the steps on the left that you would have to go through. We don't have to read them all here, but roughly, it would be around 100 steps that you have to follow just to set up a building block, which is your network layout in AWS.

So we learned from that, and we saw that lots of customers were doing these repetitive, undifferentiated tasks, and we could help them make this really quick and easy and also something that follows best practices. So we built this AWS VPC Quick Start. If you use it, you get the same layout that you see on the right diagram, but now you are able to set up your VPC in a recommended way-- with public subnets, private subnets, and the layering that you need-- in just three steps. That's the value proposition of using these Quick Starts. And Quick Starts are very modular, so they can build upon each other. When we built the Autodesk Quick Start, we were able to reuse some of the components and build a Forge application Quick Start on AWS, reusing the VPC Quick Start in some of the modules.
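For readers who haven't written CloudFormation before, here is a minimal, purely illustrative sketch of the kind of VPC building block the Quick Start automates. The resource names and CIDR ranges are assumptions; the actual VPC Quick Start also creates multiple Availability Zones, route tables, NAT gateways, and more.

AWSTemplateFormatVersion: '2010-09-09'
Description: Illustrative VPC building block (not the actual Quick Start template)

Resources:
  VPC:
    Type: AWS::EC2::VPC
    Properties:
      CidrBlock: 10.0.0.0/16          # assumed address range
      EnableDnsSupport: true
      EnableDnsHostnames: true

  PublicSubnet:
    Type: AWS::EC2::Subnet
    Properties:
      VpcId: !Ref VPC
      CidrBlock: 10.0.0.0/24
      MapPublicIpOnLaunch: true       # instances here get public IPs (bastion layer)

  PrivateSubnet:
    Type: AWS::EC2::Subnet
    Properties:
      VpcId: !Ref VPC
      CidrBlock: 10.0.1.0/24          # workload (Forge instance) layer

  InternetGateway:
    Type: AWS::EC2::InternetGateway

  GatewayAttachment:
    Type: AWS::EC2::VPCGatewayAttachment
    Properties:
      VpcId: !Ref VPC
      InternetGatewayId: !Ref InternetGateway

  # Route tables, NAT gateways, and a second Availability Zone are omitted here
  # but are part of what the real VPC Quick Start sets up for you.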

When you go to the AWS Quick Start catalog, you see a curated list of over 160 Quick Starts. You can browse, or you can search by use case, such as databases, analytics, big data, and so on. If you search for Autodesk, you will see the two Quick Starts that we have today: one for BIM 360 integration that we released this year, and one for Forge applications that we built last year. For both Quick Starts, you get the option of choosing your runtime language-- today, you can use Node.js as well as .NET Core to run your applications.

When you look at the deployment guide for a Quick Start, you see an overview and the costs and services that you need. The Quick Starts are all open source. They're free. You can take them. You can customize them. When you deploy them on AWS, you pay for the compute costs for the services that you're using.

We talk about the architecture and design considerations when building this, covering best practices and how we make the workload scale and how we make it secure. We provide step-by-step deployment instructions. It uses CloudFormation, which is a templating technology from AWS for defining your infrastructure as code. But we give a lot of configurable options, so you can tune the deployment to what you need. Finally, we have some links for troubleshooting and for what to do next with it.

So as I said, we use AWS CloudFormation, which is our way of defining infrastructure as code, and you can choose between JSON and YAML options to write those templates. What you get at the end of it is a single launch URL, or a single-button deployment, where you just fill out a form with all the tuning options that you have. And once you submit that, you create the stack. We call a unit of workload a stack.
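The form you fill out at that launch URL corresponds to the Parameters section of the CloudFormation template. Below is a hedged sketch of what such a section can look like; the parameter names are illustrative assumptions, not the actual Quick Start parameter names.

Parameters:
  KeyPairName:
    Type: AWS::EC2::KeyPair::KeyName
    Description: EC2 key pair used to reach the bastion hosts
  ForgeClientId:
    Type: String
    NoEcho: true                       # hides the value in the console
    Description: Client ID of your Forge application
  ForgeClientSecret:
    Type: String
    NoEcho: true
    Description: Client secret of your Forge application
  AccessCIDR:
    Type: String
    Description: CIDR block allowed to reach the application (for example 1.2.3.4/32)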

At the end of the deployment, what you get is the figure on the right. So let's just quickly dive into it at a high level. We'll be doing the workshop, so I won't go into too much detail, but I want to give you an overview of what you'll be deploying today as a Forge application.

So starting off at the green box, the VPC label: we create a virtual private cloud, which holds your isolated resources in AWS. Then we have a layer for the bastion hosts, which are your way of securely entering your network. You can open the bastion hosts up to your data center or your corporate network infrastructure so that you can securely get in. And then we have a private subnet layer where your actual workload, the Forge instances, will be deployed.

Now, for scalability, let's say your load varies with time, and you want something to cater to that demand. We have set this up with an Auto Scaling group, so as and when you need more instances, new instances will automatically come up and cater to that extra load. So we've set up the Forge application in an Auto Scaling group, and then to distribute the load, we have Elastic Load Balancing. It's an Application Load Balancer that we're using, which is our layer 7, or HTTP/HTTPS, load balancer.
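Here is a condensed, illustrative CloudFormation sketch of that scaling layer-- an Auto Scaling group behind an Application Load Balancer. Ports, sizes, and names are assumptions, and the launch configuration, the second Availability Zone, and the security groups are left out to keep the sketch short.

Resources:
  ForgeTargetGroup:
    Type: AWS::ElasticLoadBalancingV2::TargetGroup
    Properties:
      VpcId: !Ref VPC                  # refers to the VPC sketched earlier
      Port: 3000                       # hypothetical port the Forge app listens on
      Protocol: HTTP
      HealthCheckPath: /

  ForgeLoadBalancer:
    Type: AWS::ElasticLoadBalancingV2::LoadBalancer
    Properties:
      Scheme: internet-facing
      Subnets:
        - !Ref PublicSubnet            # the real deployment spans at least two Availability Zones

  ForgeListener:
    Type: AWS::ElasticLoadBalancingV2::Listener
    Properties:
      LoadBalancerArn: !Ref ForgeLoadBalancer
      Port: 80
      Protocol: HTTP
      DefaultActions:
        - Type: forward
          TargetGroupArn: !Ref ForgeTargetGroup

  ForgeAutoScalingGroup:
    Type: AWS::AutoScaling::AutoScalingGroup
    Properties:
      MinSize: '2'
      MaxSize: '4'                     # new instances come up automatically under load
      VPCZoneIdentifier:
        - !Ref PrivateSubnet           # Forge instances stay in the private subnets
      TargetGroupARNs:
        - !Ref ForgeTargetGroup
      LaunchConfigurationName: !Ref ForgeLaunchConfig   # instance definition omitted from this sketch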

And then there are some other options, which you can see the icons here. There's a NAT gateway, so if your instances require outbound internet connectivity to, let's say, download software or security patches, we have that. And that's also managed by us, so you don't have to worry about setting up your NAT gateways. You can just use the AWS services.

And then the Forge application, which is at the core of it, resides as an application on the EC2 instances that are in the private subnets. And then there are some other tuning options here, which are more advanced. If you want to deploy it with your own domain name, you have the option of doing that. We'll skip that in the workshop today because it requires more setup-- you have to have your own domain in Route 53-- but when you use the Quick Start to deploy your Forge web application, you have that option.

And then we'll talk a little bit about how we secure your parameters. The application requires your Forge client ID and secret, so instead of keeping those on the instance in a text file, we use something called Parameter Store, which can be used for secure storage of your secrets. So that would be the work that we'll be doing today.
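One way to give the instances access to those secrets without storing them in a plain-text file is an instance role that can read them from Systems Manager Parameter Store at runtime. This is an illustrative sketch only; the parameter path /forge/* and the resource names are assumptions, not necessarily how the Quick Start wires it up.

Resources:
  ForgeInstanceRole:
    Type: AWS::IAM::Role
    Properties:
      AssumeRolePolicyDocument:
        Version: '2012-10-17'
        Statement:
          - Effect: Allow
            Principal:
              Service: ec2.amazonaws.com
            Action: sts:AssumeRole
      Policies:
        - PolicyName: ReadForgeSecrets
          PolicyDocument:
            Version: '2012-10-17'
            Statement:
              - Effect: Allow
                Action:
                  - ssm:GetParameter
                  - ssm:GetParameters
                Resource: !Sub arn:aws:ssm:${AWS::Region}:${AWS::AccountId}:parameter/forge/*

  ForgeInstanceProfile:
    Type: AWS::IAM::InstanceProfile
    Properties:
      Roles:
        - !Ref ForgeInstanceRole       # attached to the Forge instances so the app can fetch its credentials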

In addition-- so that gets you started, but let's say you are evolving your application. You deploy a Forge application, then your requirements change, and you make updates. So how do you make sure that your updated code is deployed? For that, we build a Code Pipeline.

So Code Pipeline is an AWS service that enables you to do continuous integration, delivery, and deployment. The way we start is that all the Quick Starts are open-source GitHub repositories, so the first stage in the pipeline is your GitHub source. We also have a second source here, which is your secret configuration.

It's never a good idea to keep your secrets on GitHub, because then they will be in source control forever. So to store your secrets, we are using an encrypted S3 bucket, which will contain your Forge credentials. But the source is open source, and it will be on GitHub.
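An encrypted bucket along those lines can be declared in a few lines of CloudFormation. This is an illustrative sketch assuming default S3-managed encryption; the Quick Start may use different settings.

Resources:
  ForgeSecretsBucket:
    Type: AWS::S3::Bucket
    Properties:
      BucketEncryption:
        ServerSideEncryptionConfiguration:
          - ServerSideEncryptionByDefault:
              SSEAlgorithm: AES256     # objects (your Forge credentials) are encrypted at rest
      PublicAccessBlockConfiguration:
        BlockPublicAcls: true
        BlockPublicPolicy: true
        IgnorePublicAcls: true
        RestrictPublicBuckets: true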

Now, when you take it, you make a fork of the open source Quick Start. You don't have to keep it open. You could make it private if you have to. If you are doing anything that is your IP, feel free to do that. But as an example, we'll take the repo that we have today, and we'll use that.

Before you get to the final deployment stage, you want to make sure that the code you've written is well tested. So the second stage here is a test stage, and the Quick Start team built a tool called TaskCat, which enables you to test CloudFormation templates in multiple regions in one go. It runs your tests in all the regions and generates a report, so if there's any failure, you can go in, see why that was the case, and fix it. The idea is that, before you move on to the final deployment stage, you must test your code, and only if that passes does it move on to the next stage.
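A test stage like that is typically backed by a build project that installs and runs TaskCat against the templates. The sketch below is an assumption about how such a stage could be wired up, not the Quick Start's actual configuration, and the exact taskcat command depends on the taskcat version you install.

Resources:
  TaskCatTestProject:
    Type: AWS::CodeBuild::Project
    Properties:
      ServiceRole: !GetAtt CodeBuildRole.Arn    # IAM role omitted from this sketch
      Artifacts:
        Type: CODEPIPELINE
      Source:
        Type: CODEPIPELINE
        BuildSpec: |
          version: 0.2
          phases:
            install:
              commands:
                - pip install taskcat           # assumed install method
            build:
              commands:
                - taskcat test run              # runs the templates in every configured region and produces a report
      Environment:
        Type: LINUX_CONTAINER
        ComputeType: BUILD_GENERAL1_SMALL
        Image: aws/codebuild/standard:5.0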

So now, it's very typical, in GitHub workflows, to have a test branch and a production branch. You will have your development branch where you do all the dev work, and only when that's good to go do you merge it to your production branch. That's our third stage, the Git merge stage. It takes the code from your development branch, and if it passed the tests in the previous stage, it will then merge it to the master branch.

Once there, we are just getting to the production stage, but there's one more step remaining. CloudFormation can take assets from an S3 bucket, and it can also take them from GitHub.

But today, Code Pipeline does not support Git submodules, and in the Quick Starts we use Git submodules for modularity. So for that, we've created a stage-- the fourth stage here-- which copies your master branch to a code-hosting bucket in S3. When we go through the workshop, we'll talk about some of the steps.

It's detailed there, but we are using two buckets. One bucket is for storing your secrets, which is encrypted, and it stays secret. The second bucket stores your code, and that will be the contents of the master branch.

So now that you have your code in S3, you're ready to deploy it as a CloudFormation deployment. That's the fifth and final stage, which is the production deployment: it takes the code from S3, takes your configuration from the other bucket, and uses both to make a CloudFormation deployment. Code Pipeline can either create a new stack if it does not exist yet, or, if it already exists, update that stack. So with this development workflow, whenever you check in new code on your development branch, it will automatically be pushed all the way through to your deployed production application.

Now, it is probably a good idea to inject a manual step at the very last stage. Let's say our administrator wants to make sure that the code that is going to go to production is what they want, so at the last stage, we've actually added a manual approval. So it will go to the last stage automatically, but it will wait there.

So you go into the console, and you say Approve. Only then will it deploy the updates and update your stack. So that is the workflow we'll look into today. With that, let's get into the workshop.
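For orientation, here is a condensed CloudFormation sketch of a pipeline shaped like the one described above: a GitHub source, then a production stage gated by a manual approval that updates the Forge prod stack. The test, Git merge, and copy-to-S3 stages are elided, and the repository, stack, and role names are assumptions rather than the actual Quick Start values.

Resources:
  ForgePipeline:
    Type: AWS::CodePipeline::Pipeline
    Properties:
      RoleArn: !GetAtt PipelineRole.Arn           # pipeline IAM role omitted from this sketch
      ArtifactStore:
        Type: S3
        Location: !Ref ArtifactBucket             # artifact bucket omitted from this sketch
      Stages:
        - Name: Source
          Actions:
            - Name: GitHubSource
              ActionTypeId: { Category: Source, Owner: ThirdParty, Provider: GitHub, Version: '1' }
              OutputArtifacts:
                - Name: SourceOutput
              Configuration:
                Owner: my-github-org              # hypothetical fork of the Quick Start repo
                Repo: quickstart-autodesk-forge
                Branch: develop
                OAuthToken: '{{resolve:secretsmanager:github-token}}'   # assumed secret name
        # ... test (TaskCat), Git merge, and copy-to-S3 stages would go here ...
        - Name: Production
          Actions:
            - Name: ApproveRelease
              ActionTypeId: { Category: Approval, Owner: AWS, Provider: Manual, Version: '1' }
              RunOrder: 1                         # the pipeline waits here until someone approves
            - Name: DeployForgeStack
              ActionTypeId: { Category: Deploy, Owner: AWS, Provider: CloudFormation, Version: '1' }
              InputArtifacts:
                - Name: SourceOutput
              RunOrder: 2
              Configuration:
                ActionMode: CREATE_UPDATE         # creates the stack if missing, otherwise updates it
                StackName: forge-prod
                TemplatePath: SourceOutput::templates/forge-app.yaml    # hypothetical template path
                Capabilities: CAPABILITY_IAM
                RoleArn: !GetAtt CloudFormationDeployRole.Arn           # deploy role omitted from this sketch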

TOM JONES: Cool. Yeah, go back. Go back one slide for a second, Vinod.

So Vinod's walked us through the pipeline and the workflow. I just want to give a little bit of background on why we built this, and I failed to do that at the beginning when we were starting the workshop. The three of us have participated in many of the Autodesk Forge Accelerators, where we sit down for a week and work with developers who are building Forge applications. And what we found is that they build out their application, and at the end of the week, they do a show and tell. And guess what-- it's still sitting on that developer's laptop.

And the developers-- many of them have come from a background where they're developing for Autodesk desktop products, and so they're used to developing for Windows. And they don't necessarily have all the information or understanding to run through this complete workflow and operate their application in a performant, scalable, highly reliable, secure way when they're done. So we wanted to simplify that, and that's why we built all this stuff.

So essentially, it's the same workflow: the developer is building on their laptop, and then they say, all right, I'm going to commit. So now, they've got a source code management system. In this case, it's GitHub, but Code Pipeline is flexible. You can use many different products.

So they commit to GitHub, and it automatically runs through all this stuff for you. If you have additional tests that you want to run-- we're just using TaskCat as an example. But if you had additional tests-- unit tests, functional tests-- you could put all of that in the test phase and have those execute either in serial or in parallel.

You can use third party tools to do inspection of your code, and I mean, there's a lot you can do here. But essentially, what we were trying to do is use the automation, so infrastructure as code, to allow you to easily get from your laptop and the app you developed to production and, really, allow you to just focus on your app. Anything else you want to add there?

VINOD SHUKLA: No, that's great.

TOM JONES: OK. All right, so let's talk about that. Thanks, Vinod.

VINOD SHUKLA: You want to take that?

TOM JONES: Which one? This one?

VINOD SHUKLA: Yeah.

TOM JONES: So AWS has over 165 different services today. I'm not going to give you a test on all of them or anything, but I want to highlight one of the ones that we're going to use, and that's Cloud9. Cloud9 is an in-browser IDE, so it's a development environment in your browser. We use that just to make the class simple. Of course, you can write CloudFormation and your code in whatever IDE you want, but today, we're doing this just to make it easy for you.

We are also using a thing called Event Engine. So you should have a little slip of paper that has a little hash on it, and we're going to give you the URL for the Event Engine page. It looks like this. You put your hash in down at the bottom, and you hit Accept, and it will launch that environment for you. In that environment, we'll have a Cloud9 IDE that you can then launch and get to.

Also, when you first click it, it'll give you this team dashboard. The team dashboard has two pieces of information that are important for this lab. The first is the AWS Console button, which will allow you to launch the AWS Management Console in another browser tab-- I'll show you a picture of it in a minute-- and that's where you can get to the various AWS services. And the second thing that you're going to be interested in is a ReadMe, and the ReadMe is where you're going to find the instructions for the lab. I'll show you what those look like here in just a second, as well.

So here's what the management console looks like, and this is the default blank page that you log into when you get in there. Two things I want to call your attention to-- so today, AWS has 22 regions around the globe with 69 availability zones. If you want to know more about what that means, come and see me while the lab's going on, and I'll give you as much detail as you want. But we got a lot of infrastructure. Today, this lab is operating out of our Oregon region, so you want to make sure you stay in Oregon, because if you move it-- if you change regions, the lab's not going to work.

The second thing that you want to know on this page is the search bar, which will allow you to find other services like CloudFormation if you want to look at the output of your CloudFormation stacks. And then there are the step-by-step instructions Vinod has patiently crafted in Markdown and is hosting for you once you click on that ReadMe link. There are actually two clicks: you'll click ReadMe, and then click again, and you should see this.

One last thing about the training-- or the lab material. You may see a message at the top that says, if you're doing this as part of a workshop, please ignore this page and move to the next one. These are generic lab instructions.

You don't have to do all the steps. If you see that message, just move on. This is not the message you were looking for.

Deploying applications-- so we're going to go through the lab. In one section of the lab, once we get the infrastructure up, we're actually going to deploy a sample Forge application. These are a couple of applications that Jaime has placed there for you. And this is just a little GIF that's running: it shows that once you have your application up and you navigate to the URL, you download the file that's in the instructions and upload it here. And you should see something like this in the Forge viewer in the application that you are now hosting on AWS.

You don't have to memorize this. It's in the instructions. I just wanted to call it out because it's at the bottom of the page, and if you click on it, it will expand, and it will blow up like this so you can see it more easily.

These are your actual instructions. That's the URL, so go to the URL dashboard.eventengine.run. Enter your hash and then follow the instructions in the ReadMe.

If you've got questions, raise your hand. We'll come around and help you. We've got some lab assistants here in the back. You've got the three of us up here.

And have fun. We hope you enjoy it, and let us know how we did. Any questions? All right.

JAIME ROSALES: And then the instructions in the lab are going to ask you to use your GitHub account, so if you don't have a GitHub account, go ahead and create one-- and the same thing for the Forge account. The Forge site can be reached at forge.autodesk.com. On the top right corner, you can sign in with your Autodesk ID and quickly create an account there, then create an app in order to get the Forge client ID and client secret that you're going to be using during the lab. If you have questions about that and you need help, let us know, and I'll come by and help you out.

TOM JONES: And those are all in the prerequisites in the instructions, but we're here to help.

JAIME ROSALES: So just a quick thing, because I saw someone in the other corner trying to log in to an AWS account. You don't need to log in to any AWS account. With the hash, we're giving you access to the AWS console completely, so that way, you don't get billed for any of the services that you use today. That's the reason we give you the hash. So if you're trying to sign in to an AWS account, raise your hand, and we can help you and direct you where to go. All right, awesome, I'll be right there.

Oh, my god, sorry. Sorry. Doesn't need a new tab. Sorry. There's Amazon on the laptops.

So for creating the Forge account, you're going to head over to forge.autodesk.com. I'm going to use your thing. Yeah, and then basically, when you create an app, you should see an option to select all the APIs. If you don't have the option of selecting all the APIs, it's because you need to start your subscription. So in order to start your subscription-- and don't worry, we're not going to bill you for any of this. It gives you a free one-year subscription with 100 Cloud Credits and all that stuff, and if you want to use a different account later on, you can also transfer it to a different account.

So you're going to go into your Forge account details. In this case, Elvis already has full access, but if you don't have full access, you will see an option to start the subscription. You will need to click on this because one of the Forge services, the Model Derivative API, only becomes available at the time of creating the app if you have a valid subscription.

And then another thing-- when you're creating the app, there are some instructions for creating it. But in case you missed it, the callback URL-- we're not going to be using that at the moment, so you can type a dummy callback URL, or you can use the one that we're giving you in the instructions. It's up to you. And if you have questions about this, raise your hand, and I can come by.

TOM JONES: So I've got my own hash here. I'm going to walk through the first steps of this, so I'm going to accept that and log in. Some people have been asking about how they get to AWS. There's a button here, right when you first log in, that says AWS Console, and if you click on that, it will open another window.

And you have to click it again, so open AWS Console again. And then it should open another tab, and you should see the AWS console. There we go.

AUDIENCE: Could you show them Cloud9?

TOM JONES: Yeah, sure, we'll take a look at Cloud9. So once you open up Cloud9, you'll see that we've already created a Cloud9 interface for you, and there's a button here called Open-- or labeled Open IDE, which I can click on. That will open my Cloud9 environment in my browser.

AUDIENCE: It'll take about a minute.

TOM JONES: Yeah, it'll take about a minute for that to start up. So now, my Cloud9 environment is up. There's a big Welcome window here that takes up most of the screen.

I'm just going to close that. You don't need it. And then you can take this window that's at the bottom-- that's actually your terminal-- and maximize that so you can see what's going on on your machine.

VINOD SHUKLA: Just a quick announcement-- if you are being asked for a password when doing something with GitHub, you can use that token that you created. Also, a few people are getting errors in the update artifact step, where you see the substitutions are empty.

The way that happens is-- and you don't have to use the script-- we have three files in which we substitute your Forge secrets, email address, and IP address. The update artifact script looks for a token that is present in those files and replaces that string with the value you provide.

Now, if you did not provide the values correctly-- and the IP address string requires quotes-- what happens is it tries to do the replacement, but the variables are empty. So the tokens in the three files we are replacing into all become empty, and if you retry, it's not going to work because the tokens are gone. So you have two options: you can go to those three files and paste your email address, your Forge credentials, and your IP address manually, or you can unzip the assets zip file again to restore the files with the right tokens and then try the replacement again using the update artifact script.

And then another question was about the /32. This is CIDR block notation that we're using for IP address ranges. The way these IP addresses work is that they are made up of four octets.

So let's say your IP address is 1.2.3.4, and you want to control the range of IP addresses that are allowed. If you say /32, then it allows access for only the IP address that you provide. As you reduce that number-- if you say /24-- then the last octet of the IP address allows everything from 0 to 255. So in this example, 1.2.3.4/32 allows access only from 1.2.3.4, and 1.2.3.4/24 allows access from 1.2.3.0 up to 1.2.3.255. Now, the way we are doing the string substitution, you need a backslash and quotes around the IP address; otherwise, bash will treat the slash as an escape character, and that could cause problems.
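In the deployed stack, that CIDR value ends up in a security group rule. Here is an illustrative sketch showing how /32 restricts the rule to a single address; the names and the example address are assumptions.

Resources:
  ForgeAccessSecurityGroup:
    Type: AWS::EC2::SecurityGroup
    Properties:
      GroupDescription: Allow web access to the Forge application load balancer
      VpcId: !Ref VPC                   # refers to the VPC sketched earlier
      SecurityGroupIngress:
        - IpProtocol: tcp
          FromPort: 80
          ToPort: 80
          CidrIp: 1.2.3.4/32            # exactly one address; 1.2.3.0/24 would allow 1.2.3.0 through 1.2.3.255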

TOM JONES: [INAUDIBLE].

VINOD SHUKLA: Yeah. Let me walk--

TOM JONES: I'm trying to make that bigger.

VINOD SHUKLA: So yeah, let me just talk through again. On the steps, so when you unzip the asset bundle, you see these five files, and we listed them in the document as well. There's a Forge prod CFN, which is a configuration file that your production stack will use. And it contains a few tokens here-- your Forge client ID, your Forge client secret, your key pair, and so on.

So if you want, you can just manually-- let's say you get empty here. You can edit it manually. You don't have to use the update artifact script. Same thing here-- the Code Pipeline JSON file also contains a few tokens, and then there's one TaskCat project override that contains these three tokens-- key pair, email, client ID, and secret.

So what you have to do is, in the update artifacts file, provide your email, the client ID, the secret, and then the IP address-- and make sure you keep the quotes and the backslash there, because if you skip that, it can cause a problem. Once you run the script, what's expected is that this file will then have your email and your client ID substituted. So if that did not happen, you can just download the bundle again and unzip it, or you can update the files manually.

So quick time check-- if people have already deployed the Forge prod stack, and if it's in a create complete status, you can go in and look at the outputs. And you can see the application being deployed. Now, if there was an error in your IP address, it's possible that the URL of the app won't work.

Raise your hand, and we can walk you through how to go in and manually update your security group to fix that. And if you've already deployed the Code Pipeline, make sure you don't approve the last manual step before you've verified your first Forge prod application. We want to show that you have a base app, and that it then gets updated after you've done the approval.

TOM JONES: You want to just describe what that looks like?

VINOD SHUKLA: Yeah, sure.

TOM JONES: Just talk to him.

VINOD SHUKLA: Yeah, so when your Forge prod stack completes, you will have an application URL that you can go to. And in parallel, because that step took 15 minutes--

TOM JONES: I can show that.

VINOD SHUKLA: Yeah. So that step takes 15 minutes, so you don't have to wait. You can continue building your Code Pipeline. The Code Pipeline is also a CloudFormation stack, so when the Code Pipeline stack reaches a create-complete state, at that point it will start executing the pipeline, and it will go through the source stage, the test stage, the Git merge stage, and then it will finally reach the prod stage.

So when it reaches the prod stage, make it wait. Don't approve it immediately. First, go and verify that your Forge prod stack is in its original state with one version of the app. Once you've verified that, you can go into the pipeline again and approve the change, and you will see that the change is then propagated and your Forge prod stack is updated. Once the stack reaches the update-complete state, you will see the new application deployed.

AUDIENCE: So it automatically [INAUDIBLE] prod stage here? You don't have to disable [INAUDIBLE]?

VINOD SHUKLA: Yeah, yeah, it'll automatically stop. This one?

TOM JONES: Yeah, so essentially, what you'll see once you launch the first prod stack is just the model, and then the update adds the graphs and charts on the side there. So that's the new code that you're pushing into your pipeline and having it go and build. And then if you verify that, you can click the Manual Approve button to manually approve that change and have it flow into production.

VINOD SHUKLA: Your GitHub token is on?

AUDIENCE: So is this only going to work with Revit models? Or specific models [INAUDIBLE]?

TOM JONES: Yeah.

AUDIENCE: [INAUDIBLE].

TOM JONES: Right.

VINOD SHUKLA: When you build the Code-- what stack did you--

TOM JONES: So the question is, will this only work for Revit models? So this pipeline is built to work with any Forge application. So it's not dependent specifically on Revit. We're just using the viewer here, and then we've got data that's pulling in.

AUDIENCE: [INAUDIBLE] pipeline. I'm talking about--

VINOD SHUKLA: [INAUDIBLE].

TOM JONES: Oh, this particular code? Yeah, I don't-- that's Jaime's code. You're going to have to ask him. I'm an Amazon expert, not a Forge expert.

[SIDE CONVERSATION]

JAIME ROSALES: OK, guys, so we are getting-- actually, we already passed the finish time. But for those of you that were able to deploy the last part of it, good. If you were able to just deploy the first CFN, good. If you were not able to deploy any of them, still good. Don't worry about it.

This is about learning, and this material will become available once we get out of the craziness of AU and then also AWS re:Invent in a week and a half. I will take care of doing a screen recording with all the steps in order to show you how this thing should work. And you can always reach out to us-- either Vinod, [INAUDIBLE], or myself-- for help with any other questions that you have on how to host your Forge application on AWS, OK?

So thank you again for coming. The material, like I said, will become available. The only thing is that it's going to have to be run on your own AWS account. Unfortunately, we're not going to be able to take care of that cost anymore. But yeah, so thanks again, and I hope you keep enjoying AU. Thank you guys.

VINOD SHUKLA: Thank you so much.

[APPLAUSE]

Google Analytics (Advertising)
We use Google Analytics (Advertising) to deploy digital advertising on sites supported by Google Analytics (Advertising). Ads are based on both Google Analytics (Advertising) data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that Google Analytics (Advertising) has collected from you. We use the data that we provide to Google Analytics (Advertising) to better customize your digital advertising experience and present you with more relevant ads. Google Analytics (Advertising) Privacy Policy
Trendkite
We use Trendkite to deploy digital advertising on sites supported by Trendkite. Ads are based on both Trendkite data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that Trendkite has collected from you. We use the data that we provide to Trendkite to better customize your digital advertising experience and present you with more relevant ads. Trendkite Privacy Policy
Hotjar
We use Hotjar to deploy digital advertising on sites supported by Hotjar. Ads are based on both Hotjar data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that Hotjar has collected from you. We use the data that we provide to Hotjar to better customize your digital advertising experience and present you with more relevant ads. Hotjar Privacy Policy
6 Sense
We use 6 Sense to deploy digital advertising on sites supported by 6 Sense. Ads are based on both 6 Sense data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that 6 Sense has collected from you. We use the data that we provide to 6 Sense to better customize your digital advertising experience and present you with more relevant ads. 6 Sense Privacy Policy
Terminus
We use Terminus to deploy digital advertising on sites supported by Terminus. Ads are based on both Terminus data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that Terminus has collected from you. We use the data that we provide to Terminus to better customize your digital advertising experience and present you with more relevant ads. Terminus Privacy Policy
StackAdapt
We use StackAdapt to deploy digital advertising on sites supported by StackAdapt. Ads are based on both StackAdapt data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that StackAdapt has collected from you. We use the data that we provide to StackAdapt to better customize your digital advertising experience and present you with more relevant ads. StackAdapt Privacy Policy
The Trade Desk
We use The Trade Desk to deploy digital advertising on sites supported by The Trade Desk. Ads are based on both The Trade Desk data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that The Trade Desk has collected from you. We use the data that we provide to The Trade Desk to better customize your digital advertising experience and present you with more relevant ads. The Trade Desk Privacy Policy
RollWorks
We use RollWorks to deploy digital advertising on sites supported by RollWorks. Ads are based on both RollWorks data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that RollWorks has collected from you. We use the data that we provide to RollWorks to better customize your digital advertising experience and present you with more relevant ads. RollWorks Privacy Policy

Are you sure you want a less customized experience?

We can access your data only if you select "yes" for the categories on the previous screen. This lets us tailor our marketing so that it's more relevant for you. You can change your settings at any time by visiting our privacy statement

Your experience. Your choice.

We care about your privacy. The data we collect helps us understand how you use our products, what information you might be interested in, and what we can improve to make your engagement with Autodesk more rewarding.

May we collect and use your data to tailor your experience?

Explore the benefits of a customized experience by managing your privacy settings for this site or visit our Privacy Statement to learn more about your options.