Enterprise Accelerator Technical Resources
Your Enterprise Accelerator (EA) resources include API documentation, Quickstart guides, and notebooks to get you up and running.
Install the Python Client
The Enterprise Accelerator currently builds on the DL Python APIs for building and scaling pipelines. To get started with the EA, install the Python client and run a quick test to make sure everything is working (a minimal sketch follows this list):
- Take a look at our best practices for managing your development environment
- Install the Python client
- Authenticate with the Platform and test the connection
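A minimal sketch of that flow, assuming the client is published on PyPI as `descarteslabs`, that `descarteslabs auth login` is the authentication command, and that the v1 client exposes `Auth` and a pre-instantiated `metadata` client at the top level (consult the Quickstart guide for the exact steps for your account):

```python
# Assumed shell steps (see the Quickstart for the exact commands):
#   pip install descarteslabs
#   descarteslabs auth login
import descarteslabs as dl

# If authentication succeeded, the token payload is cached locally.
auth = dl.Auth()
print("Authenticated as:", auth.payload.get("email", "<unknown>"))

# A lightweight round trip to the Platform to confirm connectivity:
# list a few raster products from the Data Catalog.
products = dl.metadata.products(limit=5)
print("Connection OK; sample products:", [p["id"] for p in products])
```

If both print statements succeed, the client is installed and talking to the Platform.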
Enterprise Accelerator Notebooks
The notebooks in this GitHub repo demonstrate how to use the Enterprise Accelerator (EA), building from basic concepts and API usage up to a simple web application that uses EA services. To get started, clone this repository locally and run through the Jupyter notebooks in the notebooks/ directory.
- Introduction to Enterprise Accelerator (EA)
- Scalable data pipelines
- Deploying a model in Tasks using time series
- Deploying the same model in a Flask app (to simulate what the code might look like in a customer production pipeline); see the sketch after this list
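As a rough preview of what that last notebook covers, here is a minimal Flask sketch that loads a pickled model from DL Storage at startup and serves predictions over HTTP. The Storage key, the feature format, and the scikit-learn-style `predict` call are all hypothetical placeholders, not the notebook's actual code:

```python
import pickle

import descarteslabs as dl
from flask import Flask, jsonify, request

app = Flask(__name__)

# Hypothetical: "my-model-weights" is a placeholder Storage key; in practice
# you would have uploaded a pickled model there ahead of time.
model = pickle.loads(dl.Storage().get("my-model-weights"))


@app.route("/predict", methods=["POST"])
def predict():
    # Expect a JSON body like {"values": [...]}; the shape is a placeholder too.
    features = request.get_json()
    score = model.predict([features["values"]]).tolist()
    return jsonify({"score": score})


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```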
Enterprise Accelerator coming soon to Amazon Web Services (AWS)
Python APIs
The Enterprise Accelerator (EA) grants you access to the following Descartes Labs Quick Start Guides and Python APIs/services:
- Scenes - The DL Scenes API allows users to seamlessly access over 100 curated raster data products from the DL Data Catalog. With this API you can filter imagery metadata and pull stacks of imagery with a few lines of code (see the sketch after this list).
- Catalog - The Catalog is both a Python API and a web UI for exploring and uploading raster data on the DL EA. Integrate your raster data feeds into the DL EA to seamlessly intersect with our curated data products.
- Storage - DL Storage is a Python API for storing and accessing generic objects and data. Storage is commonly used for storing model weights or parameter files used in pipelines.
- Tasks - Tasks is a scalable backend, Python API, and web UI for deploying your models and data pipelines. With Tasks you can run batch processing of your models and pipelines over large AOIs and timeframes.
- Workbench - The Workbench is a DL-hosted JupyterLab instance where you can experiment and explore the DL Platform without having to set up your own environment to work in.
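For example, a typical Scenes workflow, sketched here under the assumption that your account can see the `sentinel-2:L1C` product (substitute any product ID from the Data Catalog):

```python
import descarteslabs as dl

# Define an AOI as GeoJSON; any geometry or Scenes GeoContext also works.
aoi = {
    "type": "Polygon",
    "coordinates": [[
        [-106.0, 35.6], [-105.8, 35.6], [-105.8, 35.8],
        [-106.0, 35.8], [-106.0, 35.6],
    ]],
}

# Filter imagery metadata: product, date range, and cloud cover.
scenes, ctx = dl.scenes.search(
    aoi,
    products=["sentinel-2:L1C"],  # assumed product ID from the Data Catalog
    start_datetime="2021-06-01",
    end_datetime="2021-07-01",
    cloud_fraction=0.2,
)

# Pull a stack of imagery: one (bands, y, x) array per scene, at 10 m resolution.
stack = scenes.stack("red green blue", ctx.assign(resolution=10.0))
print(stack.shape)
```

The `search` call does the metadata filtering; `stack` is what actually rasterizes the matching scenes into a single NumPy array.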
For more information about the Python APIs, please see the full API documentation at https://docs.descarteslabs.com/api.html.
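And a hedged sketch of how Storage and Tasks commonly combine in a pipeline: shared parameters go into Storage, and Tasks fans a function out over many inputs. The Storage key and the docker `image` tag below are placeholders, and this assumes the v1 client exposes `Storage` and `Tasks` at the top level (otherwise import them from `descarteslabs.client.services`):

```python
import descarteslabs as dl

# Store a shared parameter file under a key; "model-params-v1" is a placeholder.
dl.Storage().set("model-params-v1", b'{"threshold": 0.5}')


def process_tile(tile_key):
    # Runs remotely on the Tasks backend: fetch shared parameters, do work.
    import json

    import descarteslabs as dl

    params = json.loads(dl.Storage().get("model-params-v1"))
    return {"tile": tile_key, "threshold": params["threshold"]}


# Register the function with Tasks; the image tag below is a placeholder.
async_process = dl.Tasks().create_function(
    process_tile,
    name="example-batch",
    image="us.gcr.io/dl-ci-cd/images/tasks/public/py3.8:latest",  # placeholder
)

# Fan out over many inputs; each call becomes a task on the backend.
tasks = [async_process(key) for key in ["tile-001", "tile-002", "tile-003"]]
print([t.result for t in tasks])  # .result blocks until each task completes
```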
Status Page
If you are having intermittent issues with software functionality, check the Platform Status Page for the real-time status of known issues impacting Platform performance.
Documentation
Our documentation is always up to date with the latest version of our Platform and Python client, so it’s a good idea to get familiar with it. The docs are broken into several sections, in increasing order of detail:
- Tutorials: Tutorials are end-to-end solutions built on our software. These are adapted from our Applied Scientists’ research projects and are great places to start if you want to learn how our team uses DL software to solve huge problems.
- Examples: Examples demonstrate smaller pieces of functionality and are geared toward composability. We’re always adding to these examples, so let us know if there’s a topic you would like to see.
- Guides: Guides collect best practices and provide more context than Tutorials or Examples. If you’re wondering “How should I use Tasks?” (many of us do), check out the venerable Tasks Guide.
- API Reference: Lastly, our API reference is there when you need to dig into usage subtleties.
Support
Feel free to contact our Support Team at any time.
- Send an email to support@descarteslabs.com
- Use our online Service Desk portal
- Book an onboarding session