This blog has now moved to become part of our Forge website. Please change your bookmark to:
We'll be posting the same types of programming tips and tricks in the new blog - along with additional items, such as events, case studies and Forge product roadmap updates.
See you over on the new site.
(No more posts will be published on this site, but we'll keep it active for a while to give everyone time to re-bookmark to the new site)
By Augusto Goncalves (@augustomaia)
AppHarbor: "Where .NET apps grow and prosper." That's their slogan, but I like to think of it as a "Heroku for ASP.NET". It really is that easy, and there's a free tier!
This post shows the basic steps for deploying code from a GitHub repository (one of our samples, for instance), but AppHarbor can also connect directly to another Git remote. The very first step is to create an account. Once registered, go to "Your applications" on the top menu, then "Create new application".
Next, under getting started, select GitHub deploy. The next page shows a list of all your repositories. Select a .NET sample; if you don't have one, fork one of our ASP.NET samples.
Finally, we need to set up the Forge Client ID & Secret (assuming you already have a Forge developer account with an app created). Go to the "Configuration variables" section, click "New configuration variable" and add both the FORGE_CLIENT_ID and FORGE_CLIENT_SECRET values. The image below shows how it should look when both variables are created.
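For local testing before deploying, the same two variables can simply be exported in your shell session. A minimal sketch (the values below are placeholders, not real credentials):

```shell
# Placeholder values - replace with the Client ID & Secret of your own Forge app
export FORGE_CLIENT_ID="my-client-id"
export FORGE_CLIENT_SECRET="my-client-secret"

# Confirm both variables are visible to any process started from this shell
env | grep '^FORGE_'
```

On AppHarbor itself you set these through the UI as shown above; the app's code reads them the same way in both environments.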
Many of our samples handle file uploads, where the file must first be uploaded to the app server (in this case, AppHarbor) before the app can upload it again to Autodesk servers. For that, go to the "Settings" section and "Enable File System Write Access". Best practice is to upload files to the /App_Data folder. Note: an app should not allow end users to upload directly to Autodesk, as that requires an access token with write capabilities, and exposing such a token to the end user is a security risk.
As a result, under your GitHub account settings, you should see AppHarbor as an "Authorized application".
That's it. Now when you commit changes to the repository, AppHarbor will rebuild and deploy the app.
by Jaime Rosales (@afrojme)
For quite some time I've been playing with cURL and the Forge Platform, and to start 2017 I will be posting about my research using cURL and the jq processor together with the Forge Platform.
Using the Terminal, I found a quicker and simpler way to run different workflows against the Authentication API, such as obtaining 2-legged and 3-legged access tokens. With the Data Management API, I've been able to create buckets, upload files, request translations, access Hubs and add new files to specific projects that I'm part of, everything from the Terminal using cURL. But I know what you may be thinking: "cURL from the terminal is so unorganized, so hard to read, so easy to mess up." I can't entirely disagree. When your response is more than one line of JSON, it can indeed be messy and hard to read.
I know others have decided to use REST apps such as Paw or Postman, since the fear of messing up a single character in your cURL command can cause problems, and at the same time you get a better organized JSON result. But bear with me: I found out about jq while using cURL, and since then it has made quite a difference when testing the APIs.
What is jq? jq is a lightweight and flexible command-line JSON processor. A jq program is a "filter": it takes an input and produces an output. There are a lot of built-in filters for extracting a particular field of an object, converting a number to a string, and various other standard tasks. It lets you visualize the JSON response in an organized way that is much easier to read in the terminal. How do you use it? It requires a basic installation; jq can be downloaded for different OS platforms from here. After installation, restart your Terminal and you are ready for testing. You can check your version by simply typing "jq --version" in your terminal, which confirms jq was installed successfully.
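As a quick illustration of the "filter" idea (the JSON below is just a made-up sample, not a real API response), extracting a single field looks like this:

```shell
# Pipe a small JSON document through a jq filter to pull out one field;
# -r prints the raw string value instead of a quoted JSON string
echo '{"access_token":"abc123","expires_in":1799}' | jq -r '.access_token'
# prints: abc123
```

The same pattern works with any cURL response piped into jq.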
Let's look at how the previous cURL actions (obtaining a 2-legged access token and creating a bucket) look when the output is piped through jq in the Terminal.
As we can see, jq structures and color-codes the JSON returned from our REST call via cURL. In a follow-up post I will cover the entire workflow, from obtaining a 2-legged access token through translating a file and getting back the URN ready to be displayed in the Viewer. A third post will show how to access my A360 hubs and add a file to a project I'm part of, all using cURL and the jq processor.
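Beyond pretty-printing, jq is handy for scripting: it can pull the token straight out of the authentication response. A sketch below, with the real cURL call shown as a comment (the v1 endpoint and scopes reflect what we use at the time of writing) and a canned sample response standing in for the network call:

```shell
# Real call (needs valid FORGE_CLIENT_ID / FORGE_CLIENT_SECRET):
#   curl -s -X POST 'https://developer.api.autodesk.com/authentication/v1/authenticate' \
#     -d "client_id=$FORGE_CLIENT_ID" -d "client_secret=$FORGE_CLIENT_SECRET" \
#     -d 'grant_type=client_credentials' -d 'scope=bucket:create data:write' | jq .

# Canned sample response, standing in for the network call:
RESPONSE='{"token_type":"Bearer","expires_in":1799,"access_token":"sample-token"}'
TOKEN=$(echo "$RESPONSE" | jq -r '.access_token')
echo "Authorization: Bearer $TOKEN"
```

The $TOKEN variable can then be reused in the Authorization header of the bucket-creation call.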
Thank you for reading.
Question:
After a translation job completes (POST Job, then GET Manifest shows it has finished), the manifest is ready. But when I try to GET Metadata (:urn/metadata/:guid), it takes time until the metadata is prepared. That is why the help mentions the 202 status code: "Request accepted but processing not complete. Call this endpoint iteratively until a 200 is returned."
My confusion is: since the translation job has already translated the source model to a format (say SVF), shouldn't the metadata dataset already be ready? Why does it need to start a separate process?
Answer:
The metadata request is separate from the SVF/LMV translation. The metadata is embedded inside the SVF and property database files, but it is not extracted during the SVF/LMV translation itself. Generally speaking, the metadata is the object hierarchy of the SVF model, so the SVF package needs to be unpacked and parsed first.
You probably know the sample https://extract.autodesk.io. It extracts the complete dataset of an SVF from the *.svf file and downloads the files one by one for local deployment. It shows the workflow of unpacking the *.svf and getting its manifest, after which the project downloads the relevant files one by one (lines 218-220 in the current version). This more or less explains how the formal endpoint (...metadata) works behind the scenes.
var pack = new zip(success, { base64: false, checkCRC32: true });
success = pack.files['manifest.json'].asNodeBuffer();
manifest = JSON.parse(success.toString('utf8'));
In short, the separate process is the current design: your code needs to check whether the metadata is prepared, polling the endpoint until it returns 200.
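That check can be a simple polling loop on the HTTP status code. A sketch in shell, where a fake function stands in for the real cURL request (the commented-out curl line shows what the real call would look like; the URN, GUID and token are assumed to be set elsewhere):

```shell
# fake_request simulates the metadata endpoint: 202 twice, then 200.
# In a real script, replace the body with something like:
#   STATUS=$(curl -s -o /dev/null -w '%{http_code}' \
#     -H "Authorization: Bearer $TOKEN" \
#     "https://developer.api.autodesk.com/modelderivative/v2/designdata/$URN/metadata/$GUID")
ATTEMPT=0
fake_request() {
  ATTEMPT=$((ATTEMPT + 1))
  if [ "$ATTEMPT" -lt 3 ]; then STATUS=202; else STATUS=200; fi
}

while :; do
  fake_request
  echo "HTTP $STATUS"
  if [ "$STATUS" = "200" ]; then break; fi
  # a real script would sleep here before retrying
done
echo "metadata ready after $ATTEMPT requests"
```

A production version would also cap the number of attempts and handle error codes (4xx/5xx) instead of looping forever.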
By Adam Nagy (@AdamTheNagy)
Two of my colleagues and I will be attending IoT Tech Expo in London next Monday and Tuesday (23-24 January, 2017). We'll have a booth there where we'll show how Forge can be used in the IoT field, and also take part in panel discussions.
If you are attending the expo then do drop by to say hello :)
By Augusto Goncalves (@augustomaia)
Cache is a common source of problems during web development. It's not unusual to solve some mysterious problem by clearing the browser history (or just the pages stored locally). So why not do our localhost debugging in incognito/private mode?
Making it the default option in Visual Studio is quite simple: on the "Run" toolbar button, click the dropdown arrow, then select "Browse With..."
Next, select your browser. For Chrome, add the --incognito argument; use -private-window for Firefox or -private for Edge. Add a "Friendly name" such as "Chrome Incognito" and click "Set as Default" to make it even easier.
That's it. Now when you close the browser and restart, everything is cleared and you should avoid cache problems during development.
Once your app is published, serving it over HTTPS can also help, as browsers are more conservative about keeping local copies of files served over HTTPS.
By Stephen Preston (@_stephenpreston)
The Autodesk Forge DevCon conference, originally scheduled for June 27-28 in San Francisco, will now be held on November 13-14 in Las Vegas.
One time. One location. Two great conferences.
We received feedback from many of last year's Forge DevCon attendees that it was hard to travel to two Autodesk conferences in a year. This change will allow you to attend Forge DevCon and Autodesk University in Las Vegas in a single trip. There will be DevCon-only, AU-only, and AU+DevCon ticketing options available.
Because the conference will now be held five months later, we have postponed the Call For Proposals and attendee registration.
New Date. New Location. Same Great Experience.
Last year, our first Forge DevCon offered two days of the most up-to-date training, technologies, news, and insights around design, simulation, reality capture, AR, VR, and IoT. It was an indispensable experience for anyone working with the Forge platform.
At this year's event you can expect more of the same. Only more so.
Stay tuned for updates and please visit the Forge DevCon 2017 site to learn more.
If you have any questions please don't hesitate to contact us at forge.devcon@autodesk.com.