Ecowitt API


Intro

This article covers how to download data from your weather station using the Ecowitt API and Python. We will set up a Docker environment and store the data in MongoDB.

Last year I bought a ProTech 7-in-1 station with a GW1100 gateway to monitor weather conditions at my cabin. For this project, I want to create a weather app so my father and grandpa can check statistics and real-time data. First of all, I didn't want to call the API every time I need data, and second, Ecowitt doesn't store data forever.

  • Data storage service on the Ecowitt server: https://www.ecowitt.net

    • Stores data for the past 3 months at 5-minute intervals

    • Stores data for the past 1 year at 30-minute intervals

    • Stores data for the past 2 years at 4-hour intervals

Let's start coding

To start, you need an API_KEY, an APP_KEY, and the IMEI/MAC address of your device from Ecowitt, plus a Docker environment for deployment.

First of all, I recommend storing your environment variables in a .env file as good practice. We can then import them into Python.

# .env
MONGO_HOST=mongodb://mongodb:27017/
MONGO_USERNAME=your_username
MONGO_PASSWORD=your_password

ECOWITT_APPLICATION_KEY=some_application_key
ECOWITT_API_KEY=some_api_key
ECOWITT_IMEI_MAC=some_imei_mac

In this project, I have chosen a document database, MongoDB, to store the API responses as documents. The credentials above will be used later when we connect.

# handler.py
import urllib.parse

from decouple import config

ECOWITT_APPLICATION_KEY = config("ECOWITT_APPLICATION_KEY")
ECOWITT_API_KEY = config("ECOWITT_API_KEY")
ECOWITT_IMEI_MAC = config("ECOWITT_IMEI_MAC")
MONGO_USERNAME = urllib.parse.quote_plus(config("MONGO_USERNAME"))
MONGO_PASSWORD = urllib.parse.quote_plus(config("MONGO_PASSWORD"))
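The quote_plus calls matter when credentials contain characters that are special inside a URI. A small illustration (the password here is made up):

```python
import urllib.parse

# Percent-encode credentials so they are safe to embed in a MongoDB URI
password = "p@ss/word"
print(urllib.parse.quote_plus(password))  # -> p%40ss%2Fword
```

Without the encoding, the `@` in the password would be parsed as the end of the credentials part of the connection string.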

Now that we have imported our environment keys, we will set up logging for our application. If something goes wrong, we can read about it in our log file.

# handler.py
import logging

logging.basicConfig(
    filename="handler.log", 
    filemode="a", 
    format="%(asctime)s - %(levelname)s - %(message)s", 
    datefmt="%d-%b-%y %H:%M:%S",
)
logger = logging.getLogger(__name__)
logger.setLevel(logging.INFO)

With this config, our logs are stored in the file "handler.log" in the format time - level - message. This application doesn't log anything specific beyond whether data was inserted or something went wrong with the Ecowitt API.

Next up is the function that will run every time we want an update. It first connects to the database, then sends a request to the Ecowitt API, and finally stores the response in the database.

# handler.py
import datetime

import pymongo
import requests

def run_handler():
    # Connect to MongoDB: protocol://username:password@(ip/docker_container)
    client = pymongo.MongoClient(
        "mongodb://%s:%s@mongodb" % (MONGO_USERNAME, MONGO_PASSWORD)
    )
    # Our database name
    db = client["ecowitt"]

    # Ecowitt API endpoint for real-time data
    url = (
        "https://api.ecowitt.net/api/v3/device/real_time"
        f"?application_key={ECOWITT_APPLICATION_KEY}"
        f"&api_key={ECOWITT_API_KEY}"
        f"&mac={ECOWITT_IMEI_MAC}"
        "&call_back=all"
    )

    # Call the Ecowitt API
    response = requests.get(url)

    # Get current time
    now = datetime.datetime.now()

    # Convert to JSON
    data = response.json()

    # Insert data into MongoDB
    db.ecowitt.insert_one(data)

    # Log info to our handler.log
    logger.info(f"Data inserted at {now}")

if __name__ == "__main__":
    # The cronjob runs this file directly, so call the handler here
    run_handler()
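As an optional hardening step (not in the original handler), you can refuse to insert error responses. The Ecowitt v3 API wraps its JSON results with "code" and "msg" fields, where code 0 means success; treat those field names as an assumption and check them against your own responses. A minimal sketch:

```python
def validate_ecowitt_payload(data: dict) -> dict:
    """Return the payload if the Ecowitt response reports success.

    Assumes the Ecowitt v3 response shape {"code": 0, "msg": "success",
    "data": {...}}; any non-zero code is treated as an error.
    """
    if data.get("code") != 0:
        raise RuntimeError(
            f"Ecowitt API error {data.get('code')}: {data.get('msg')}"
        )
    return data

# Inside run_handler() you would then do, after requests.get(url):
#     response.raise_for_status()  # fail fast on HTTP-level errors
#     data = validate_ecowitt_payload(response.json())
```

Combined with the logging setup above, a raised exception means nothing bogus lands in the database and the failure is visible in handler.log.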

With this, our Python function is ready and we can set up a crontab. To create the cronjob we need to specify when the function should be executed; in my case I have chosen every minute. Create a file named crontab:

# crontab
*/1 * * * * python3 /app/handler.py

Now we can create our Dockerfile, in a file named Dockerfile.

# Dockerfile
FROM python:3.10.9-alpine

# Copy files
COPY crontab requirements.txt handler.py .env /app/

# Set working directory
WORKDIR /app

# Install dependencies
RUN pip install -r requirements.txt

# Install crontab
RUN crontab crontab

# Run crontab
CMD ["crond", "-f"]
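The Dockerfile copies a requirements.txt that isn't shown above. Based on the imports in handler.py, a minimal one would look like this (pin versions as you see fit):

```
# requirements.txt
pymongo
requests
python-decouple
```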

Now we can use this in a docker-compose stack with our MongoDB database.

# docker-compose.yaml
version: '3'

# Creating an isolated network for communication.
networks:
  ecowitt:
    driver: bridge

services:
    ecowitt:
      build: ./
      container_name: ecowitt
      # Tag for the image built from the Dockerfile above
      image: ecowitt:latest
      restart: unless-stopped
      volumes:
        - appdata:/var/www
      depends_on:
        - mongodb
      networks:
        - ecowitt

    mongodb:
      # You can use the latest image
      image: mongo:4.4.6
      container_name: mongodb
      restart: unless-stopped
      command: mongod --auth
      environment:
        MONGO_INITDB_ROOT_USERNAME: your_username
        MONGO_INITDB_ROOT_PASSWORD: your_password
        MONGO_INITDB_DATABASE: ecowitt
        MONGODB_DATA_DIR: /data/db
        MONGODB_LOG_DIR: /dev/null
      # I have exposed ports to connect to mongodb compass
      ports:
        - "27017:27017"
      volumes:
        - mongodbdata:/data/db
      networks:
        - ecowitt

# Persistent volumes so data doesn't get deleted on reboot
volumes:
  mongodbdata:
  appdata:
    driver: local
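Since port 27017 is published, you can point MongoDB Compass (or any client on the host) at the stack with a connection string along these lines; substitute your real credentials. The root user created via MONGO_INITDB_ROOT_USERNAME authenticates against the admin database, hence authSource=admin:

```
mongodb://your_username:your_password@localhost:27017/?authSource=admin
```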

Now that everything is ready, we can run "docker-compose up -d" to deploy the application to Docker. You then have a function that collects data from your weather station and stores it locally. If you want to monitor your cronjobs and take it a step further, you can check out Cronitor to get emails when your function did not run and to check statuses.
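To close the loop, here is a small sketch of reading the stored documents back, for example for the weather app mentioned in the intro. It relies on the fact that MongoDB ObjectIds embed their creation time; the collection name matches the handler above, everything else is illustrative:

```python
def latest_readings(collection, n=1):
    """Return the n most recently inserted documents.

    MongoDB ObjectIds embed their creation time, so sorting on _id
    descending yields the newest inserts first.
    """
    return list(collection.find().sort("_id", -1).limit(n))

# Against the running stack you would use it roughly like this
# (connection string and credentials are placeholders):
#
#   import pymongo
#   client = pymongo.MongoClient(
#       "mongodb://your_username:your_password@localhost:27017/"
#   )
#   print(latest_readings(client["ecowitt"].ecowitt))
```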