Compare commits

1 commit

| Author | SHA1 | Date |
|---|---|---|
|  | 6761f9dfdd |  |
47 .github/ISSUE_TEMPLATE/bug_report.md vendored Normal file
@@ -0,0 +1,47 @@
---
name: Bug report
about: Create a report to help us improve

---

<!-- Thanks for reporting a bug for this project. READ THIS FIRST:
- Please make sure to submit issues in the right GitHub repository, if unsure just post it here:
  - esphomeyaml [here] - This is mostly for reporting bugs when compiling and when you get a long stack trace while compiling or if a configuration fails to validate.
  - esphomelib [https://github.com/OttoWinter/esphomelib] - Report bugs there if the ESP is crashing or a feature is not working as expected.
  - esphomedocs [https://github.com/OttoWinter/esphomedocs] - Report bugs there if the documentation is wrong/outdated.
- Provide as many details as possible. Paste logs, configuration sample and code into the backticks (```). Do not delete any text from this template!
-->

**Operating environment (Hass.io/Docker/pip/etc.):**
<!--
Please provide details about your environment.
-->

**ESP (ESP32/ESP8266/Board/Sonoff):**
<!--
Please provide details about which ESP you're using.
-->

**Affected component:**
<!--
Please add the link to the documentation at https://esphomelib.com/esphomeyaml/index.html of the component in question.
-->


**Description of problem:**


**Problem-relevant YAML-configuration entries:**
```yaml

```

**Traceback (if applicable):**
<!--
Please copy the traceback here if compilation is failing. If possible, also connect to the ESP and copy its logs into the backticks.
-->
```

```

**Additional information:**
22 .github/ISSUE_TEMPLATE/feature_request.md vendored Normal file
@@ -0,0 +1,22 @@
---
name: Feature request
about: Suggest an idea for this project

---

<!-- READ THIS FIRST:
- This is for feature requests only, if you want to have a certain new sensor/module supported, please use the "new integration" template.
- Please be as descriptive as possible, especially use-cases that can otherwise not be solved boost the problem's priority.
-->

**Is your feature request related to a problem? Please describe.**
<!--
A clear and concise description of what the problem is.
-->
Ex. I'm always frustrated when [...]

**Describe the solution you'd like**
A description of what you want to happen.

**Additional context**
Add any other context about the feature request here.
20 .github/ISSUE_TEMPLATE/new-integration.md vendored Normal file
@@ -0,0 +1,20 @@
---
name: New integration
about: Suggest a new integration for esphomelib

---

<!-- READ THIS FIRST:
- This is for new integrations (such as new sensors/modules) only, for new features within the environment please use the "feature request" template.
- Do not delete anything from this template and fill out the form as precisely as possible.
-->

**What new integration would you wish to have?**
<!-- A name/description of the new integration/board. -->

**If possible, provide a link to an existing library for the integration:**

**Is your feature request related to a problem? Please describe.**
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]

**Additional context**
10 .github/PULL_REQUEST_TEMPLATE.md vendored
@@ -3,12 +3,18 @@

**Related issue (if applicable):** fixes <link to issue>

**Pull request in [esphome-docs](https://github.com/esphome/esphome-docs) with documentation (if applicable):** esphome/esphome-docs#<esphome-docs PR number goes here>
**Pull request in [esphome-core](https://github.com/esphome/esphome-core) with C++ framework changes (if applicable):** esphome/esphome-core#<esphome-core PR number goes here>
**Pull request in [esphomedocs](https://github.com/OttoWinter/esphomedocs) with documentation (if applicable):** OttoWinter/esphomedocs#<esphomedocs PR number goes here>
**Pull request in [esphomelib](https://github.com/OttoWinter/esphomelib) with C++ framework changes (if applicable):** OttoWinter/esphomelib#<esphomelib PR number goes here>

## Example entry for YAML configuration (if applicable):
```yaml

```

## Checklist:
- [ ] The code change is tested and works locally.
- [ ] Tests have been added to verify that the new code works (under `tests/` folder).
- [ ] Check this box if you have read, understand, comply, and agree with the [Code of Conduct](https://github.com/OttoWinter/esphomeyaml/blob/master/CODE_OF_CONDUCT.md).

If user exposed functionality or configuration variables are added/changed:
- [ ] Documentation added/updated in [esphomedocs](https://github.com/OttoWinter/esphomedocs).
7 .github/issue-close-app.yml vendored
@@ -1,7 +0,0 @@
comment: >-
  https://github.com/esphome/esphome/issues/430
issueConfigs:
  - content:
      - "OTHERWISE THE ISSUE WILL BE CLOSED AUTOMATICALLY"

caseInsensitive: false
1 .gitignore vendored
@@ -105,4 +105,3 @@ venv.bak/

config/
tests/build/
tests/.esphome/
529 .gitlab-ci.yml
@@ -2,126 +2,299 @@
# Based on https://gitlab.com/hassio-addons/addon-node-red/blob/master/.gitlab-ci.yml
variables:
  DOCKER_DRIVER: overlay2
  DOCKER_HOST: tcp://docker:2375/

stages:
  - lint
  - test
  - build
  - deploy

.lint: &lint
  image: esphome/esphome-base-amd64
  stage: lint
  before_script:
    - pip install -e .
    - pip install flake8==3.6.0 pylint==1.9.4 pillow
  tags:
    - docker
    - python2.7
    - esphomeyaml-lint

.test: &test
  image: esphome/esphome-base-amd64
  stage: test
  before_script:
    - pip install -e .
  tags:
    - docker
    - python2.7
    - esphomeyaml-test
  variables:
    TZ: UTC
  cache:
    paths:
      - tests/build

.docker-base: &docker-base
  image: esphome/esphome-base-builder
.docker-builder: &docker-builder
  before_script:
    - docker info
    - docker login -u "$DOCKER_USER" -p "$DOCKER_PASSWORD"
  script:
    - docker run --rm --privileged hassioaddons/qemu-user-static:latest
    - TAG="${CI_COMMIT_TAG#v}"
    - TAG="${TAG:-${CI_COMMIT_SHA:0:7}}"
    - echo "Tag ${TAG}"

    - |
      if [[ "${IS_HASSIO}" == "YES" ]]; then
        BUILD_FROM=esphome/esphome-hassio-base-${BUILD_ARCH}:1.3.0
        BUILD_TO=esphome/esphome-hassio-${BUILD_ARCH}
        DOCKERFILE=docker/Dockerfile.hassio
      else
        BUILD_FROM=esphome/esphome-base-${BUILD_ARCH}:1.3.0
        if [[ "${BUILD_ARCH}" == "amd64" ]]; then
          BUILD_TO=esphome/esphome
        else
          BUILD_TO=esphome/esphome-${BUILD_ARCH}
        fi
        DOCKERFILE=docker/Dockerfile
      fi

    - |
      docker build \
        --build-arg "BUILD_FROM=${BUILD_FROM}" \
        --build-arg "BUILD_VERSION=${TAG}" \
        --tag "${BUILD_TO}:${TAG}" \
        --file "${DOCKERFILE}" \
        .
    - |
      if [[ "${RELEASE}" = "YES" ]]; then
        echo "Pushing to ${BUILD_TO}:${TAG}"
        docker push "${BUILD_TO}:${TAG}"
      fi
    - |
      if [[ "${LATEST}" = "YES" ]]; then
        echo "Pushing to :latest"
        docker tag ${BUILD_TO}:${TAG} ${BUILD_TO}:latest
        docker push ${BUILD_TO}:latest
      fi
    - |
      if [[ "${BETA}" = "YES" ]]; then
        echo "Pushing to :beta"
        docker tag \
          ${BUILD_TO}:${TAG} \
          ${BUILD_TO}:beta
        docker push ${BUILD_TO}:beta
      fi
    - |
      if [[ "${DEV}" = "YES" ]]; then
        echo "Pushing to :dev"
        docker tag \
          ${BUILD_TO}:${TAG} \
          ${BUILD_TO}:dev
        docker push ${BUILD_TO}:dev
      fi
    - docker login -u "$CI_REGISTRY_USER" -p "$CI_REGISTRY_PASSWORD" "$CI_REGISTRY"
  services:
    - docker:dind
  tags:
    - docker
  stage: deploy
    - hassio-builder

flake8:
  <<: *lint
  script:
    - flake8 esphome
    - flake8 esphomeyaml

pylint:
  <<: *lint
  script:
    - pylint esphome
    - pylint esphomeyaml

test1:
  <<: *test
  script:
    - esphome tests/test1.yaml compile
    - esphomeyaml tests/test1.yaml compile

test2:
  <<: *test
  script:
    - esphome tests/test2.yaml compile
    - esphomeyaml tests/test2.yaml compile

test3:
  <<: *test
.build-hassio: &build-hassio
  <<: *docker-builder
  stage: build
  script:
    - esphome tests/test3.yaml compile
    - docker run --rm --privileged hassioaddons/qemu-user-static:latest
    - BUILD_FROM=homeassistant/${ADDON_ARCH}-base-ubuntu:latest
    - ADDON_VERSION="${CI_COMMIT_TAG#v}"
    - ADDON_VERSION="${ADDON_VERSION:-${CI_COMMIT_SHA:0:7}}"
    - ESPHOMELIB_VERSION="${ESPHOMELIB_VERSION:-dev}"
    - echo "Build from ${BUILD_FROM}"
    - echo "Add-on version ${ADDON_VERSION}"
    - echo "Esphomelib version ${ESPHOMELIB_VERSION}"
    - echo "Tag ${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:dev"
    - echo "Tag ${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:${CI_COMMIT_SHA}"
    - |
      docker build \
        --build-arg "BUILD_FROM=${BUILD_FROM}" \
        --build-arg "ADDON_ARCH=${ADDON_ARCH}" \
        --build-arg "ADDON_VERSION=${ADDON_VERSION}" \
        --build-arg "ESPHOMELIB_VERSION=${ESPHOMELIB_VERSION}" \
        --tag "${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:dev" \
        --tag "${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:${CI_COMMIT_SHA}" \
        --file "docker/Dockerfile.hassio" \
        .
    - |
      if [ "${DO_PUSH:-true}" = true ]; then
        echo "Pushing to CI registry"
        docker push ${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:${CI_COMMIT_SHA}
        docker push ${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:dev
      fi

# Generic deploy template
.deploy-release: &deploy-release
  <<: *docker-builder
  stage: deploy
  script:
    - version="${CI_COMMIT_TAG#v}"
    - echo "Publishing release version ${version}"
    - docker pull "${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:${CI_COMMIT_SHA}"
    - docker login -u "$DOCKER_USER" -p "$DOCKER_PASSWORD"

    - echo "Tag ${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:${version}"
    - |
      docker tag \
        "${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:${CI_COMMIT_SHA}" \
        "${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:${version}"
    - docker push "${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:${version}"

    - echo "Tag ${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:latest"
    - |
      docker tag \
        "${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:${CI_COMMIT_SHA}" \
        "${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:latest"
    - docker push "${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:latest"

    - echo "Tag ${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:rc"
    - |
      docker tag \
        "${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:${CI_COMMIT_SHA}" \
        "${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:rc"
    - docker push "${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:rc"

    - echo "Tag ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${version}"
    - |
      docker tag \
        "${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:${CI_COMMIT_SHA}" \
        "ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${version}"
    - docker push "ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${version}"

    - echo "Tag ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:latest"
    - |
      docker tag \
        "${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:${CI_COMMIT_SHA}" \
        "ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:latest"
    - docker push "ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:latest"

    - echo "Tag ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:rc"
    - |
      docker tag \
        "${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:${CI_COMMIT_SHA}" \
        "ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:rc"
    - docker push "ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:rc"
  only:
    - /^v\d+\.\d+\.\d+$/
  except:
    - /^(?!master).+@/

.deploy-beta: &deploy-beta
  <<: *docker-builder
  stage: deploy
  script:
    - version="${CI_COMMIT_TAG#v}"
    - echo "Publishing beta version ${version}"
    - docker pull "${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:${CI_COMMIT_SHA}"
    - docker login -u "$DOCKER_USER" -p "$DOCKER_PASSWORD"

    - echo "Tag ${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:${version}"
    - |
      docker tag \
        "${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:${CI_COMMIT_SHA}" \
        "${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:${version}"
    - docker push "${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:${version}"

    - echo "Tag ${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:rc"
    - |
      docker tag \
        "${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:${CI_COMMIT_SHA}" \
        "${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:rc"
    - docker push "${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:rc"

    - echo "Tag ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${version}"
    - |
      docker tag \
        "${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:${CI_COMMIT_SHA}" \
        "ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${version}"
    - docker push "ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${version}"

    - echo "Tag ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:rc"
    - |
      docker tag \
        "${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:${CI_COMMIT_SHA}" \
        "ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:rc"
    - docker push "ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:rc"
  only:
    - /^v\d+\.\d+\.\d+b\d+$/
  except:
    - /^(?!rc).+@/

# Build jobs
build:normal:
  <<: *docker-builder
  stage: build
  script:
    - docker build -t "${CI_REGISTRY}/esphomeyaml:dev" .

.build-hassio-edge: &build-hassio-edge
  <<: *build-hassio
  except:
    - /^v\d+\.\d+\.\d+$/
    - /^v\d+\.\d+\.\d+b\d+$/

.build-hassio-release: &build-hassio-release
  <<: *build-hassio
  only:
    - /^v\d+\.\d+\.\d+$/
    - /^v\d+\.\d+\.\d+b\d+$/

build:hassio-armhf-edge:
  <<: *build-hassio-edge
  variables:
    ADDON_ARCH: armhf
    DO_PUSH: "false"

build:hassio-armhf:
  <<: *build-hassio-release
  variables:
    ADDON_ARCH: armhf
    ESPHOMELIB_VERSION: "${CI_COMMIT_TAG}"

#build:hassio-aarch64-edge:
#  <<: *build-hassio-edge
#  variables:
#    ADDON_ARCH: aarch64
#    DO_PUSH: "false"

#build:hassio-aarch64:
#  <<: *build-hassio-release
#  variables:
#    ADDON_ARCH: aarch64
#    ESPHOMELIB_VERSION: "${CI_COMMIT_TAG}"

build:hassio-i386-edge:
  <<: *build-hassio-edge
  variables:
    ADDON_ARCH: i386
    DO_PUSH: "false"

build:hassio-i386:
  <<: *build-hassio-release
  variables:
    ADDON_ARCH: i386
    ESPHOMELIB_VERSION: "${CI_COMMIT_TAG}"

build:hassio-amd64-edge:
  <<: *build-hassio-edge
  variables:
    ADDON_ARCH: amd64
    DO_PUSH: "false"

build:hassio-amd64:
  <<: *build-hassio-release
  variables:
    ADDON_ARCH: amd64
    ESPHOMELIB_VERSION: "${CI_COMMIT_TAG}"

# Deploy jobs
deploy-release:armhf:
  <<: *deploy-release
  variables:
    ADDON_ARCH: armhf

deploy-beta:armhf:
  <<: *deploy-beta
  variables:
    ADDON_ARCH: armhf

#deploy-release:aarch64:
#  <<: *deploy-release
#  variables:
#    ADDON_ARCH: aarch64
#
#deploy-beta:aarch64:
#  <<: *deploy-beta
#  variables:
#    ADDON_ARCH: aarch64

deploy-release:i386:
  <<: *deploy-release
  variables:
    ADDON_ARCH: i386

deploy-beta:i386:
  <<: *deploy-beta
  variables:
    ADDON_ARCH: i386

deploy-release:amd64:
  <<: *deploy-release
  variables:
    ADDON_ARCH: amd64

deploy-beta:amd64:
  <<: *deploy-beta
  variables:
    ADDON_ARCH: amd64

.deploy-pypi: &deploy-pypi
  stage: deploy
  image: python:2.7
  before_script:
    - pip install -e .
    - pip install twine
@@ -129,7 +302,8 @@ test3:
    - python setup.py sdist
    - twine upload dist/*
  tags:
    - docker
    - python2.7
    - esphomeyaml-test

deploy-release:pypi:
  <<: *deploy-pypi
@@ -144,204 +318,3 @@ deploy-beta:pypi:
    - /^v\d+\.\d+\.\d+b\d+$/
  except:
    - /^(?!rc).+@/

.latest: &latest
  <<: *docker-base
  only:
    - /^v([0-9\.]+)$/
  except:
    - branches

.latest-vars: &latest-vars
  RELEASE: YES
  LATEST: YES
  # Also push to beta tag
  BETA: YES

.beta: &beta
  <<: *docker-base
  only:
    - /^v([0-9\.]+b\d+)$/
  except:
    - branches

.beta-vars: &beta-vars
  RELEASE: YES
  BETA: YES

.dev: &dev
  <<: *docker-base
  only:
    - dev

.dev-vars: &dev-vars
  DEV: YES

#aarch64-beta-docker:
#  <<: *beta
#  variables:
#    BETA: "YES"
#    BUILD_ARCH: aarch64
#    IS_HASSIO: "NO"
#    RELEASE: "YES"
#aarch64-beta-hassio:
#  <<: *beta
#  variables:
#    BETA: "YES"
#    BUILD_ARCH: aarch64
#    IS_HASSIO: "YES"
#    RELEASE: "YES"
#aarch64-dev-docker:
#  <<: *dev
#  variables:
#    BUILD_ARCH: aarch64
#    DEV: "YES"
#    IS_HASSIO: "NO"
#aarch64-dev-hassio:
#  <<: *dev
#  variables:
#    BUILD_ARCH: aarch64
#    DEV: "YES"
#    IS_HASSIO: "YES"
#aarch64-latest-docker:
#  <<: *latest
#  variables:
#    BETA: "YES"
#    BUILD_ARCH: aarch64
#    IS_HASSIO: "NO"
#    LATEST: "YES"
#    RELEASE: "YES"
#aarch64-latest-hassio:
#  <<: *latest
#  variables:
#    BETA: "YES"
#    BUILD_ARCH: aarch64
#    IS_HASSIO: "YES"
#    LATEST: "YES"
#    RELEASE: "YES"
amd64-beta-docker:
  <<: *beta
  variables:
    BETA: "YES"
    BUILD_ARCH: amd64
    IS_HASSIO: "NO"
    RELEASE: "YES"
amd64-beta-hassio:
  <<: *beta
  variables:
    BETA: "YES"
    BUILD_ARCH: amd64
    IS_HASSIO: "YES"
    RELEASE: "YES"
amd64-dev-docker:
  <<: *dev
  variables:
    BUILD_ARCH: amd64
    DEV: "YES"
    IS_HASSIO: "NO"
amd64-dev-hassio:
  <<: *dev
  variables:
    BUILD_ARCH: amd64
    DEV: "YES"
    IS_HASSIO: "YES"
amd64-latest-docker:
  <<: *latest
  variables:
    BETA: "YES"
    BUILD_ARCH: amd64
    IS_HASSIO: "NO"
    LATEST: "YES"
    RELEASE: "YES"
amd64-latest-hassio:
  <<: *latest
  variables:
    BETA: "YES"
    BUILD_ARCH: amd64
    IS_HASSIO: "YES"
    LATEST: "YES"
    RELEASE: "YES"
armhf-beta-docker:
  <<: *beta
  variables:
    BETA: "YES"
    BUILD_ARCH: armhf
    IS_HASSIO: "NO"
    RELEASE: "YES"
armhf-beta-hassio:
  <<: *beta
  variables:
    BETA: "YES"
    BUILD_ARCH: armhf
    IS_HASSIO: "YES"
    RELEASE: "YES"
armhf-dev-docker:
  <<: *dev
  variables:
    BUILD_ARCH: armhf
    DEV: "YES"
    IS_HASSIO: "NO"
armhf-dev-hassio:
  <<: *dev
  variables:
    BUILD_ARCH: armhf
    DEV: "YES"
    IS_HASSIO: "YES"
armhf-latest-docker:
  <<: *latest
  variables:
    BETA: "YES"
    BUILD_ARCH: armhf
    IS_HASSIO: "NO"
    LATEST: "YES"
    RELEASE: "YES"
armhf-latest-hassio:
  <<: *latest
  variables:
    BETA: "YES"
    BUILD_ARCH: armhf
    IS_HASSIO: "YES"
    LATEST: "YES"
    RELEASE: "YES"
i386-beta-docker:
  <<: *beta
  variables:
    BETA: "YES"
    BUILD_ARCH: i386
    IS_HASSIO: "NO"
    RELEASE: "YES"
i386-beta-hassio:
  <<: *beta
  variables:
    BETA: "YES"
    BUILD_ARCH: i386
    IS_HASSIO: "YES"
    RELEASE: "YES"
i386-dev-docker:
  <<: *dev
  variables:
    BUILD_ARCH: i386
    DEV: "YES"
    IS_HASSIO: "NO"
i386-dev-hassio:
  <<: *dev
  variables:
    BUILD_ARCH: i386
    DEV: "YES"
    IS_HASSIO: "YES"
i386-latest-docker:
  <<: *latest
  variables:
    BETA: "YES"
    BUILD_ARCH: i386
    IS_HASSIO: "NO"
    LATEST: "YES"
    RELEASE: "YES"
i386-latest-hassio:
  <<: *latest
  variables:
    BETA: "YES"
    BUILD_ARCH: i386
    IS_HASSIO: "YES"
    LATEST: "YES"
    RELEASE: "YES"
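The `.docker-builder` template above derives the Docker tag from `CI_COMMIT_TAG` (falling back to the short commit SHA) and picks the image name from `IS_HASSIO` and `BUILD_ARCH`. As a reading aid, here is a minimal Python sketch of that same logic; it is an illustration only, not part of the commit, and the function name is made up:

```python
# Illustration (not part of the commit) of the tag/image-name matrix in the
# .docker-builder template above.
def image_and_tag(ci_commit_tag, ci_commit_sha, is_hassio, build_arch):
    # TAG="${CI_COMMIT_TAG#v}", falling back to the 7-character commit SHA.
    tag = ci_commit_tag[1:] if ci_commit_tag.startswith("v") else ci_commit_tag
    tag = tag or ci_commit_sha[:7]
    if is_hassio:
        build_to = "esphome/esphome-hassio-%s" % build_arch
    elif build_arch == "amd64":
        build_to = "esphome/esphome"  # the default arch gets the bare name
    else:
        build_to = "esphome/esphome-%s" % build_arch
    return "%s:%s" % (build_to, tag)

# image_and_tag("v1.11.0", "0123abcdef", False, "armhf") -> "esphome/esphome-armhf:1.11.0"
# image_and_tag("", "0123abcdef", True, "amd64")         -> "esphome/esphome-hassio-amd64:0123abc"
```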
48 .travis.yml
@@ -1,38 +1,20 @@
sudo: false
language: python

cache:
  directories:
    - "~/.platformio"
    - "$TRAVIS_BUILD_DIR/tests/build/test1/.piolibdeps"
    - "$TRAVIS_BUILD_DIR/tests/build/test2/.piolibdeps"
    - "$TRAVIS_BUILD_DIR/tests/build/test3/.piolibdeps"

matrix:
  fast_finish: true
python:
  - "2.7"
jobs:
  include:
    - python: "2.7"
      env: TARGET=Lint2.7
      install: pip install -e . && pip install flake8==3.6.0 pylint==1.9.4 pillow
    - name: "Lint"
      install:
        - pip install -r requirements.txt
        - pip install flake8==3.5.0 pylint==1.9.3 tzlocal pillow
      script:
        - flake8 esphome
        - pylint esphome
    - python: "3.5.3"
      env: TARGET=Lint3.5
      install: pip install -U https://github.com/platformio/platformio-core/archive/develop.zip && pip install -e . && pip install flake8==3.6.0 pylint==2.3.0 pillow
        - flake8 esphomeyaml
        - pylint esphomeyaml
    - name: "Test"
      install:
        - pip install -e .
        - pip install tzlocal pillow
      script:
        - flake8 esphome
        - pylint esphome
    - python: "2.7"
      env: TARGET=Test2.7
      install: pip install -e . && pip install flake8==3.6.0 pylint==1.9.4 pillow
      script:
        - esphome tests/test1.yaml compile
        - esphome tests/test2.yaml compile
        - esphome tests/test3.yaml compile
    #- python: "3.5.3"
    #  env: TARGET=Test3.5
    #  install: pip install -U https://github.com/platformio/platformio-core/archive/develop.zip && pip install -e . && pip install flake8==3.6.0 pylint==2.3.0 pillow
    #  script:
    #    - esphome tests/test1.yaml compile
    #    - esphome tests/test2.yaml compile
        - esphomeyaml tests/test1.yaml compile
        - esphomeyaml tests/test2.yaml compile
@@ -1,16 +1,16 @@
# Contributing to ESPHome
# Contributing to esphomeyaml

This python project is responsible for reading in YAML configuration files,
esphomeyaml is a part of esphomelib and is responsible for reading in YAML configuration files,
converting them to C++ code. This code is then converted to a platformio project and compiled
with [esphome-core](https://github.com/esphome/esphome-core), the C++ framework behind the project.
with [esphomelib](https://github.com/OttoWinter/esphomelib), the C++ framework behind the project.

For a detailed guide, please see https://esphome.io/guides/contributing.html#contributing-to-esphomeyaml
For a detailed guide, please see https://esphomelib.com/esphomeyaml/guides/contributing.html#contributing-to-esphomeyaml

Things to note when contributing:

- Please test your changes :)
- If a new feature is added or an existing user-facing feature is changed, you should also
  update the [docs](https://github.com/esphome/esphome-docs). See [contributing to esphome-docs](https://esphome.io/guides/contributing.html#contributing-to-esphomedocs)
  update the [docs](https://github.com/OttoWinter/esphomedocs). See [contributing to esphomedocs](https://esphomelib.com/esphomeyaml/guides/contributing.html#contributing-to-esphomedocs)
  for more information.
- Please also update the tests in the `tests/` folder. You can do so by just adding a line in one of the YAML files
  which checks if your new feature compiles correctly.
29 Dockerfile Normal file
@@ -0,0 +1,29 @@
ARG BUILD_FROM=python:2.7
FROM ${BUILD_FROM}
MAINTAINER Otto Winter <contact@otto-winter.com>

RUN apt-get update && apt-get install -y \
        python-pil \
        git \
    && apt-get clean && rm -rf /var/lib/apt/lists/* /tmp/* && \
    pip install --no-cache-dir --no-binary :all: platformio && \
    platformio settings set enable_telemetry No && \
    platformio settings set check_libraries_interval 1000000 && \
    platformio settings set check_platformio_interval 1000000 && \
    platformio settings set check_platforms_interval 1000000

ENV ESPHOMEYAML_OTA_HOST_PORT=6123
EXPOSE 6123
VOLUME /config
WORKDIR /usr/src/app

COPY docker/platformio.ini /pio/platformio.ini
RUN platformio run -d /pio; rm -rf /pio

COPY . .
RUN pip install --no-cache-dir --no-binary :all: -e . && \
    pip install --no-cache-dir --no-binary :all: tzlocal

WORKDIR /config
ENTRYPOINT ["esphomeyaml"]
CMD ["/config", "dashboard"]
19 MANIFEST.in
@@ -1,17 +1,4 @@
include README.md
include esphome/dashboard/templates/index.html
include esphome/dashboard/templates/login.html
include esphome/dashboard/static/ace.js
include esphome/dashboard/static/esphome.css
include esphome/dashboard/static/esphome.js
include esphome/dashboard/static/favicon.ico
include esphome/dashboard/static/jquery.min.js
include esphome/dashboard/static/jquery.validate.min.js
include esphome/dashboard/static/jquery-ui.min.js
include esphome/dashboard/static/materialize.min.css
include esphome/dashboard/static/materialize.min.js
include esphome/dashboard/static/materialize-stepper.min.css
include esphome/dashboard/static/materialize-stepper.min.js
include esphome/dashboard/static/mode-yaml.js
include esphome/dashboard/static/theme-dreamweaver.js
include esphome/dashboard/static/ext-searchbox.js
include esphomeyaml/dashboard/templates/index.html
include esphomeyaml/dashboard/static/materialize-stepper.min.css
include esphomeyaml/dashboard/static/materialize-stepper.min.js
39 README.md
@@ -1,9 +1,38 @@
# ESPHome [](https://travis-ci.org/esphome/esphome) [](https://discord.gg/KhAMKrd) [](https://GitHub.com/esphome/esphome/releases/)
# esphomeyaml for [esphomelib](https://github.com/OttoWinter/esphomelib)

[](https://esphome.io/)
### Getting Started Guide: https://esphomelib.com/esphomeyaml/guides/getting_started_command_line.html

**Documentation:** https://esphome.io/
### Available Components: https://esphomelib.com/esphomeyaml/index.html

For issues, please go to [the issue tracker](https://github.com/esphome/issues/issues).
esphomeyaml is the solution for your ESP8266/ESP32 projects with Home Assistant. It allows you to create **custom firmwares** for your microcontrollers with no programming experience required. All you need to know is the YAML configuration format, which is also used by [Home Assistant](https://www.home-assistant.io).

For feature requests, please see [feature requests](https://github.com/esphome/feature-requests/issues).
esphomeyaml will:

* Read your configuration file and warn you about potential errors (like using invalid pins).
* Create a custom C++ sketch file for you using esphomeyaml's powerful C++ generation engine.
* Compile the sketch file for you using [platformio](http://platformio.org/).
* Upload the binary to your ESP via Over the Air updates.
* Automatically start remote logs via MQTT.

And all of that with a single command 🎉:

```bash
esphomeyaml configuration.yaml run
```

## Features

* **No programming experience required:** just edit YAML configuration files like you're used to with Home Assistant.
* **Flexible:** Use [esphomelib](https://github.com/OttoWinter/esphomelib)'s powerful core to create custom sensors/outputs.
* **Fast and efficient:** Written in C++ and keeps memory consumption to a minimum.
* **Made for [Home Assistant](https://www.home-assistant.io):** Almost all [Home Assistant](https://www.home-assistant.io) features are supported out of the box, including RGB lights and many more.
* **Easy reproducible configuration:** No need to go through a long setup process for every single node. Just copy a configuration file and run a single command.
* **Smart Over The Air Updates:** esphomeyaml has OTA updates deeply integrated into the system. It even automatically enters a recovery mode if a boot loop is detected.
* **Powerful logging engine:** View colorful logs and debug issues remotely.
* **Open Source**
* For me: Makes documenting esphomelib's features a lot easier.

## Special Thanks

Special Thanks to the Home Assistant project. Lots of the code base of esphomeyaml is based on Home Assistant, for example the loading and config validation code.
@@ -1,10 +0,0 @@
ARG BUILD_FROM=esphome/esphome-base-amd64:1.3.0
FROM ${BUILD_FROM}

COPY . .
RUN \
    pip2 install --no-cache-dir --no-binary :all: -e .

WORKDIR /config
ENTRYPOINT ["esphome"]
CMD ["/config", "dashboard"]
30 docker/Dockerfile.builder Normal file
@@ -0,0 +1,30 @@
FROM multiarch/ubuntu-core:amd64-xenial

# setup locales
RUN apt-get update && apt-get install -y \
        jq \
        git \
        python3-setuptools \
    && rm -rf /var/lib/apt/lists/*
ENV LANG C.UTF-8

# Install docker
# https://docs.docker.com/engine/installation/linux/docker-ce/ubuntu/
RUN apt-get update && apt-get install -y \
        apt-transport-https \
        ca-certificates \
        curl \
        software-properties-common \
    && rm -rf /var/lib/apt/lists/* \
    && curl -fsSL https://download.docker.com/linux/ubuntu/gpg | apt-key add - \
    && add-apt-repository "deb https://download.docker.com/linux/ubuntu $(lsb_release -cs) stable" \
    && apt-get update && apt-get install -y docker-ce \
    && rm -rf /var/lib/apt/lists/*

# setup arm binary support
RUN apt-get update && apt-get install -y \
        qemu-user-static \
        binfmt-support \
    && rm -rf /var/lib/apt/lists/*

WORKDIR /data
@@ -1,20 +1,42 @@
ARG BUILD_FROM=esphome/esphome-hassio-base-amd64:1.3.0
# Dockerfile for HassIO add-on
ARG BUILD_FROM=homeassistant/amd64-base-ubuntu:latest
FROM ${BUILD_FROM}

# Copy root filesystem
COPY docker/rootfs/ /
COPY setup.py setup.cfg MANIFEST.in /opt/esphome/
COPY esphome /opt/esphome/esphome
RUN apt-get update && apt-get install -y --no-install-recommends \
        python \
        python-pip \
        python-setuptools \
        python-pil \
        git \
    && apt-get clean && rm -rf /var/lib/apt/lists/* /tmp/* && \
    pip install --no-cache-dir --no-binary :all: platformio && \
    platformio settings set enable_telemetry No && \
    platformio settings set check_libraries_interval 1000000 && \
    platformio settings set check_platformio_interval 1000000 && \
    platformio settings set check_platforms_interval 1000000

RUN \
    pip2 install --no-cache-dir --no-binary :all: -e /opt/esphome
COPY docker/platformio.ini /pio/platformio.ini
RUN platformio run -d /pio; rm -rf /pio

# Build arguments
ARG BUILD_VERSION=dev
ARG ESPHOMELIB_VERSION="dev"
RUN platformio lib -g install "https://github.com/OttoWinter/esphomelib.git#${ESPHOMELIB_VERSION}"

COPY . .
RUN pip install --no-cache-dir --no-binary :all: -e . && \
    pip install --no-cache-dir --no-binary :all: tzlocal

CMD ["esphomeyaml", "/config/esphomeyaml", "dashboard"]

# Build arguments
ARG ADDON_ARCH
ARG ADDON_VERSION

# Labels
LABEL \
    io.hass.name="ESPHome" \
    io.hass.description="Manage and program ESP8266/ESP32 microcontrollers through YAML configuration files" \
    io.hass.name="esphomeyaml" \
    io.hass.description="esphomeyaml HassIO add-on for intelligently managing all your ESP8266/ESP32 devices." \
    io.hass.arch="${ADDON_ARCH}" \
    io.hass.type="addon" \
    io.hass.version=${BUILD_VERSION}
    io.hass.version="${ADDON_VERSION}" \
    io.hass.url="https://esphomelib.com/esphomeyaml/index.html" \
    maintainer="Otto Winter <contact@otto-winter.com>"
@@ -3,4 +3,4 @@ FROM python:2.7
COPY requirements.txt /requirements.txt

RUN pip install -r /requirements.txt && \
    pip install flake8==3.6.0 pylint==1.9.4 pillow
    pip install flake8==3.5.0 pylint==1.9.3 tzlocal pillow
@@ -8,14 +8,12 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
        git \
    && apt-get clean && rm -rf /var/lib/apt/lists/* /tmp/* && \
    pip install --no-cache-dir --no-binary :all: platformio && \
    platformio settings set enable_telemetry No && \
    platformio settings set check_libraries_interval 1000000 && \
    platformio settings set check_platformio_interval 1000000 && \
    platformio settings set check_platforms_interval 1000000
    platformio settings set enable_telemetry No

COPY docker/platformio.ini /pio/platformio.ini
RUN platformio run -d /pio; rm -rf /pio

COPY requirements.txt /requirements.txt

RUN pip install --no-cache-dir -r /requirements.txt
RUN pip install --no-cache-dir -r /requirements.txt && \
    pip install --no-cache-dir tzlocal pillow
@@ -1,26 +0,0 @@
#!/usr/bin/env bash

# the Docker repository tag being built.
declare CACHE_TAG
echo "CACHE_TAG: ${CACHE_TAG}"
# the name and tag of the Docker repository being built. (This variable is a combination of DOCKER_REPO:CACHE_TAG.)
declare IMAGE_NAME
echo "IMAGE_NAME: ${IMAGE_NAME}"
# the architecture to build
declare BUILD_ARCH
echo "BUILD_ARCH: ${BUILD_ARCH}"
# whether this is a hassio build
declare IS_HASSIO
echo "IS_HASSIO: ${IS_HASSIO}"
echo "PWD: $PWD"

if [[ ${IS_HASSIO} = "YES" ]]; then
  docker build \
    --build-arg "BUILD_FROM=esphome/esphome-hassio-base-${BUILD_ARCH}:1.3.0" \
    --build-arg "BUILD_VERSION=${CACHE_TAG}" \
    -t "${IMAGE_NAME}" -f ../docker/Dockerfile.hassio ..
else
  docker build \
    --build-arg "BUILD_FROM=esphome/esphome-base-${BUILD_ARCH}:1.3.0" \
    -t "${IMAGE_NAME}" -f ../docker/Dockerfile ..
fi
@@ -1,18 +0,0 @@
#!/usr/bin/env bash

# the architecture to build
declare BUILD_ARCH

echo "BUILD_ARCH: ${BUILD_ARCH}"

if [[ ${BUILD_ARCH} = "amd64" ]]; then
  echo "No qemu required..."
  exit 0
fi
if [[ ${BUILD_ARCH} = "i386" ]]; then
  echo "No qemu required..."
  exit 0
fi

echo "Installing qemu..."
docker run --rm --privileged multiarch/qemu-user-static:register --reset
@@ -2,11 +2,11 @@
; platforms

[env:espressif8266]
platform = espressif8266@1.8.0
platform = espressif8266
board = nodemcuv2
framework = arduino

[env:espressif32]
platform = espressif32@1.5.0
platform = espressif32
board = nodemcu-32s
framework = arduino
@@ -1,35 +0,0 @@
#!/usr/bin/with-contenv bash
# ==============================================================================
# Community Hass.io Add-ons: ESPHome
# This file checks if all user configuration requirements are met
# ==============================================================================
# shellcheck disable=SC1091
source /usr/lib/hassio-addons/base.sh

# Check SSL requirements, if enabled
if hass.config.true 'ssl'; then
  if ! hass.config.has_value 'certfile'; then
    hass.die 'SSL is enabled, but no certfile was specified.'
  fi

  if ! hass.config.has_value 'keyfile'; then
    hass.die 'SSL is enabled, but no keyfile was specified'
  fi

  if ! hass.file_exists "/ssl/$(hass.config.get 'certfile')"; then
    if ! hass.file_exists "/ssl/$(hass.config.get 'keyfile')"; then
      # Both files are missing, let's print a friendlier error message
      text="You enabled encrypted connections using the \"ssl\": true option.
      However, the SSL files \"$(hass.config.get 'certfile')\" and \"$(hass.config.get 'keyfile')\"
      were not found. If you're using Hass.io on your local network and don't want
      to encrypt connections to the ESPHome dashboard, you can manually disable
      SSL by setting \"ssl\" to false."
      hass.die "${text}"
    fi
    hass.die 'The configured certfile is not found'
  fi

  if ! hass.file_exists "/ssl/$(hass.config.get 'keyfile')"; then
    hass.die 'The configured keyfile is not found'
  fi
fi
@@ -1,28 +0,0 @@
#!/usr/bin/with-contenv bash
# ==============================================================================
# Community Hass.io Add-ons: ESPHome
# Configures NGINX for use with ESPHome
# ==============================================================================
# shellcheck disable=SC1091
source /usr/lib/hassio-addons/base.sh

declare certfile
declare keyfile
declare port

mkdir -p /var/log/nginx

# Enable SSL
if hass.config.true 'ssl'; then
  rm /etc/nginx/nginx.conf
  mv /etc/nginx/nginx-ssl.conf /etc/nginx/nginx.conf

  certfile=$(hass.config.get 'certfile')
  keyfile=$(hass.config.get 'keyfile')

  sed -i "s/%%certfile%%/${certfile}/g" /etc/nginx/nginx.conf
  sed -i "s/%%keyfile%%/${keyfile}/g" /etc/nginx/nginx.conf
fi

port=$(hass.config.get 'port')
sed -i "s/%%port%%/${port}/g" /etc/nginx/nginx.conf
@@ -1,14 +0,0 @@
#!/usr/bin/with-contenv bash
# ==============================================================================
# Community Hass.io Add-ons: ESPHome
# This file installs the user ESPHome version if specified
# ==============================================================================
# shellcheck disable=SC1091
source /usr/lib/hassio-addons/base.sh

declare esphome_version

if hass.config.has_value 'esphome_version'; then
  esphome_version=$(hass.config.get 'esphome_version')
  pip2 install --no-cache-dir --no-binary :all: "https://github.com/esphome/esphome/archive/${esphome_version}.zip"
fi
@@ -1,13 +0,0 @@
#!/usr/bin/with-contenv bash
# ==============================================================================
# Community Hass.io Add-ons: ESPHome
# This file migrates the esphome config directory from the old path
# ==============================================================================
# shellcheck disable=SC1091
source /usr/lib/hassio-addons/base.sh

if [[ ! -d /config/esphome && -d /config/esphomeyaml ]]; then
  echo "Moving config directory from /config/esphomeyaml to /config/esphome"
  mv /config/esphomeyaml /config/esphome
  mv /config/esphome/.esphomeyaml /config/esphome/.esphome
fi
@@ -1,62 +0,0 @@
worker_processes 1;
pid /var/run/nginx.pid;
error_log stderr;

events {
    worker_connections 1024;
}

http {
    access_log stdout;
    include mime.types;
    default_type application/octet-stream;
    sendfile on;
    keepalive_timeout 65;

    upstream esphome {
        ip_hash;
        server unix:/var/run/esphome.sock;
    }
    map $http_upgrade $connection_upgrade {
        default upgrade;
        '' close;
    }

    server {
        server_name hassio.local;
        listen %%port%% default_server ssl;
        root /dev/null;

        ssl_certificate /ssl/%%certfile%%;
        ssl_certificate_key /ssl/%%keyfile%%;
        ssl_protocols TLSv1.2;
        ssl_prefer_server_ciphers on;
        ssl_ciphers ECDHE-RSA-AES256-GCM-SHA512:DHE-RSA-AES256-GCM-SHA512:ECDHE-RSA-AES256-GCM-SHA384:DHE-RSA-AES256-GCM-SHA384:ECDHE-RSA-AES256-SHA384:ECDHE-RSA-AES256-SHA:DHE-RSA-AES256-SHA;
        ssl_ecdh_curve secp384r1;
        ssl_session_timeout 10m;
        ssl_session_cache shared:SSL:10m;
        ssl_session_tickets off;
        ssl_stapling on;
        ssl_stapling_verify on;

        # Redirect http requests to https on the same port.
        # https://rageagainstshell.com/2016/11/redirect-http-to-https-on-the-same-port-in-nginx/
        error_page 497 https://$http_host$request_uri;

        location / {
            proxy_redirect off;
            proxy_pass http://esphome;

            proxy_http_version 1.1;
            proxy_set_header Upgrade $http_upgrade;
            proxy_set_header Connection $connection_upgrade;
            proxy_set_header Authorization "";

            proxy_set_header X-Real-IP $remote_addr;
            proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
            proxy_set_header X-Forwarded-Proto $scheme;
            proxy_set_header Host $http_host;
            proxy_set_header X-NginX-Proxy true;
        }
    }
}
@@ -1,46 +0,0 @@
worker_processes 1;
pid /var/run/nginx.pid;
error_log stderr;

events {
    worker_connections 1024;
}

http {
    access_log stdout;
    include mime.types;
    default_type application/octet-stream;
    sendfile on;
    keepalive_timeout 65;

    upstream esphome {
        ip_hash;
        server unix:/var/run/esphome.sock;
    }
    map $http_upgrade $connection_upgrade {
        default upgrade;
        '' close;
    }

    server {
        server_name hassio.local;
        listen %%port%% default_server;
        root /dev/null;

        location / {
            proxy_redirect off;
            proxy_pass http://esphome;

            proxy_http_version 1.1;
            proxy_set_header Upgrade $http_upgrade;
            proxy_set_header Connection $connection_upgrade;
            proxy_set_header Authorization "";

            proxy_set_header X-Real-IP $remote_addr;
            proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
            proxy_set_header X-Forwarded-Proto $scheme;
            proxy_set_header Host $http_host;
            proxy_set_header X-NginX-Proxy true;
        }
    }
}
@@ -1,9 +0,0 @@
#!/usr/bin/execlineb -S0
# ==============================================================================
# Community Hass.io Add-ons: ESPHome
# Take down the S6 supervision tree when ESPHome fails
# ==============================================================================
if -n { s6-test $# -ne 0 }
if -n { s6-test ${1} -eq 256 }

s6-svscanctl -t /var/run/s6/services
@@ -1,28 +0,0 @@
#!/usr/bin/with-contenv bash
# ==============================================================================
# Community Hass.io Add-ons: ESPHome
# Runs the ESPHome dashboard
# ==============================================================================
# shellcheck disable=SC1091
source /usr/lib/hassio-addons/base.sh

export ESPHOME_IS_HASSIO=true

if hass.config.true 'leave_front_door_open'; then
  export DISABLE_HA_AUTHENTICATION=true
fi

if hass.config.true 'streamer_mode'; then
  export ESPHOME_STREAMER_MODE=true
fi

if hass.config.true 'status_use_ping'; then
  export ESPHOME_DASHBOARD_USE_PING=true
fi

if hass.config.has_value 'relative_url'; then
  export ESPHOME_DASHBOARD_RELATIVE_URL=$(hass.config.get 'relative_url')
fi

hass.log.info "Starting ESPHome dashboard..."
exec esphome /config/esphome dashboard --socket /var/run/esphome.sock --hassio
@@ -1,9 +0,0 @@
#!/usr/bin/execlineb -S0
# ==============================================================================
# Community Hass.io Add-ons: ESPHome
# Take down the S6 supervision tree when NGINX fails
# ==============================================================================
if -n { s6-test $# -ne 0 }
if -n { s6-test ${1} -eq 256 }

s6-svscanctl -t /var/run/s6/services
@@ -1,10 +0,0 @@
#!/usr/bin/with-contenv bash
# ==============================================================================
# Community Hass.io Add-ons: ESPHome
# Runs the NGINX proxy
# ==============================================================================
# shellcheck disable=SC1091
source /usr/lib/hassio-addons/base.sh

hass.log.info "Starting NGINX..."
exec nginx -g "daemon off;"
@@ -1,330 +0,0 @@
syntax = "proto3";

// The Home Assistant protocol is structured as a simple
// TCP socket with short binary messages encoded in the protocol buffers format
// First, a message in this protocol has a specific format:
//  * VarInt denoting the size of the message object. (type is not part of this)
//  * VarInt denoting the type of message.
//  * The message object encoded as a ProtoBuf message

// The connection is established in 4 steps:
//  * First, the client connects to the server and sends a "Hello Request" identifying itself
//  * The server responds with a "Hello Response" and selects the protocol version
//  * After receiving this message, the client attempts to authenticate itself using
//    the password and a "Connect Request"
//  * The server responds with a "Connect Response" and notifies of invalid password.
// If anything in this initial process fails, the connection must immediately be closed
// by both sides and _no_ disconnection message is to be sent.
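As a reading aid for the framing the comments above describe, here is a minimal Python sketch; it is not part of this commit, and `encode_varint` / `frame_message` are hypothetical helper names:

```python
# Illustrative sketch of the framing described above (names are hypothetical):
# a frame is VarInt(payload size) + VarInt(message type) + serialized message.

def encode_varint(value):
    # Base-128 varint, least-significant group first (protobuf-style).
    out = bytearray()
    while True:
        byte = value & 0x7F
        value >>= 7
        if value:
            out.append(byte | 0x80)  # high bit set: more groups follow
        else:
            out.append(byte)
            return bytes(out)

def frame_message(msg_type, payload):
    # msg_type: the numeric message ID (e.g. 1 for HelloRequest below)
    # payload:  the message's SerializeToString() bytes
    return encode_varint(len(payload)) + encode_varint(msg_type) + payload
```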
// Message sent at the beginning of each connection
// Can only be sent by the client and only at the beginning of the connection
message HelloRequest {
  // Description of client (like User Agent)
  // For example "Home Assistant"
  // Not strictly necessary to send but nice for debugging
  // purposes.
  string client_info = 1;
}

// Confirmation of successful connection request.
// Can only be sent by the server and only at the beginning of the connection
message HelloResponse {
  // The version of the API to use. The _client_ (for example Home Assistant) needs to check
  // for compatibility and if necessary adapt to an older API.
  // Major is for breaking changes in the base protocol - a mismatch will lead to immediate disconnect_client_
  // Minor is for breaking changes in individual messages - a mismatch will lead to a warning message
  uint32 api_version_major = 1;
  uint32 api_version_minor = 2;

  // A string identifying the server (ESP); like client info this may be empty
  // and only exists for debugging/logging purposes.
  // For example "ESPHome v1.10.0 on ESP8266"
  string server_info = 3;
}
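A minimal sketch of the client-side check the version comment above calls for; the helper is hypothetical, assuming a client that speaks API version 1.2:

```python
# Hypothetical client-side version check mirroring the comment above:
# a major mismatch must disconnect, a minor mismatch only warns.
CLIENT_API_MAJOR, CLIENT_API_MINOR = 1, 2

def check_api_version(hello_response):
    if hello_response.api_version_major != CLIENT_API_MAJOR:
        raise RuntimeError(
            "Incompatible API major version: %s" % hello_response.api_version_major)
    if hello_response.api_version_minor != CLIENT_API_MINOR:
        print("Warning: API minor version mismatch "
              "(server: %s)" % hello_response.api_version_minor)
```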
// Message sent at the beginning of each connection to authenticate the client
// Can only be sent by the client and only at the beginning of the connection
message ConnectRequest {
  // The password to log in with
  string password = 1;
}

// Confirmation of successful connection. After this the connection is available for all traffic.
// Can only be sent by the server and only at the beginning of the connection
message ConnectResponse {
  bool invalid_password = 1;
}

// Request to close the connection.
// Can be sent by both the client and server
message DisconnectRequest {
  // Do not close the connection before the acknowledgement arrives
}

message DisconnectResponse {
  // Empty - Both parties are required to close the connection after this
  // message has been received.
}

message PingRequest {
  // Empty
}

message PingResponse {
  // Empty
}

message DeviceInfoRequest {
  // Empty
}

message DeviceInfoResponse {
  bool uses_password = 1;

  // The name of the node, given by "App.set_name()"
  string name = 2;

  // The mac address of the device. For example "AC:BC:32:89:0E:A9"
  string mac_address = 3;

  // A string describing the ESPHome version. For example "1.10.0"
  string esphome_core_version = 4;

  // A string describing the date of compilation, this is generated by the compiler
  // and therefore may not be in the same format all the time.
  // If the user isn't using esphome, this will also not be set.
  string compilation_time = 5;

  // The model of the board. For example NodeMCU
  string model = 6;

  bool has_deep_sleep = 7;
}

message ListEntitiesRequest {
  // Empty
}

message ListEntitiesBinarySensorResponse {
  string object_id = 1;
  fixed32 key = 2;
  string name = 3;
  string unique_id = 4;

  string device_class = 5;
  bool is_status_binary_sensor = 6;
}
message ListEntitiesCoverResponse {
  string object_id = 1;
  fixed32 key = 2;
  string name = 3;
  string unique_id = 4;

  bool is_optimistic = 5;
}
message ListEntitiesFanResponse {
  string object_id = 1;
  fixed32 key = 2;
  string name = 3;
  string unique_id = 4;

  bool supports_oscillation = 5;
  bool supports_speed = 6;
}
message ListEntitiesLightResponse {
  string object_id = 1;
  fixed32 key = 2;
  string name = 3;
  string unique_id = 4;

  bool supports_brightness = 5;
  bool supports_rgb = 6;
  bool supports_white_value = 7;
  bool supports_color_temperature = 8;
  float min_mireds = 9;
  float max_mireds = 10;
  repeated string effects = 11;
}
message ListEntitiesSensorResponse {
  string object_id = 1;
  fixed32 key = 2;
  string name = 3;
  string unique_id = 4;

  string icon = 5;
  string unit_of_measurement = 6;
  int32 accuracy_decimals = 7;
}
message ListEntitiesSwitchResponse {
  string object_id = 1;
  fixed32 key = 2;
  string name = 3;
  string unique_id = 4;

  string icon = 5;
  bool optimistic = 6;
}
message ListEntitiesTextSensorResponse {
  string object_id = 1;
  fixed32 key = 2;
  string name = 3;
  string unique_id = 4;

  string icon = 5;
}
message ListEntitiesDoneResponse {
  // Empty
}

message SubscribeStatesRequest {
  // Empty
}
message BinarySensorStateResponse {
  fixed32 key = 1;
  bool state = 2;
}
message CoverStateResponse {
  fixed32 key = 1;
  enum CoverState {
    OPEN = 0;
    CLOSED = 1;
  }
  CoverState state = 2;
}
enum FanSpeed {
  LOW = 0;
  MEDIUM = 1;
  HIGH = 2;
}
message FanStateResponse {
  fixed32 key = 1;
  bool state = 2;
  bool oscillating = 3;
  FanSpeed speed = 4;
}
message LightStateResponse {
  fixed32 key = 1;
  bool state = 2;
  float brightness = 3;
  float red = 4;
  float green = 5;
  float blue = 6;
  float white = 7;
  float color_temperature = 8;
  string effect = 9;
}
message SensorStateResponse {
  fixed32 key = 1;
  float state = 2;
}
message SwitchStateResponse {
  fixed32 key = 1;
  bool state = 2;
}
message TextSensorStateResponse {
  fixed32 key = 1;
  string state = 2;
}

message CoverCommandRequest {
  fixed32 key = 1;
  enum CoverCommand {
    OPEN = 0;
    CLOSE = 1;
    STOP = 2;
  }
  bool has_state = 2;
  CoverCommand command = 3;
}
message FanCommandRequest {
  fixed32 key = 1;
  bool has_state = 2;
  bool state = 3;
  bool has_speed = 4;
  FanSpeed speed = 5;
  bool has_oscillating = 6;
  bool oscillating = 7;
}
message LightCommandRequest {
  fixed32 key = 1;
  bool has_state = 2;
  bool state = 3;
  bool has_brightness = 4;
  float brightness = 5;
  bool has_rgb = 6;
  float red = 7;
  float green = 8;
  float blue = 9;
  bool has_white = 10;
  float white = 11;
  bool has_color_temperature = 12;
  float color_temperature = 13;
  bool has_transition_length = 14;
  uint32 transition_length = 15;
  bool has_flash_length = 16;
  uint32 flash_length = 17;
  bool has_effect = 18;
  string effect = 19;
}
message SwitchCommandRequest {
  fixed32 key = 1;
  bool state = 2;
}

enum LogLevel {
  NONE = 0;
  ERROR = 1;
  WARN = 2;
  INFO = 3;
  DEBUG = 4;
  VERBOSE = 5;
  VERY_VERBOSE = 6;
}

message SubscribeLogsRequest {
  LogLevel level = 1;
  bool dump_config = 2;
}

message SubscribeLogsResponse {
  LogLevel level = 1;
  string tag = 2;
  string message = 3;
  bool send_failed = 4;
}

message SubscribeServiceCallsRequest {

}

message ServiceCallResponse {
  string service = 1;
  map<string, string> data = 2;
  map<string, string> data_template = 3;
  map<string, string> variables = 4;
}

// 1. Client sends SubscribeHomeAssistantStatesRequest
// 2. Server responds with zero or more SubscribeHomeAssistantStateResponse (async)
// 3. Client sends HomeAssistantStateResponse for state changes.
message SubscribeHomeAssistantStatesRequest {

}

message SubscribeHomeAssistantStateResponse {
  string entity_id = 1;
}

message HomeAssistantStateResponse {
  string entity_id = 1;
  string state = 2;
}

message GetTimeRequest {

}

message GetTimeResponse {
  fixed32 epoch_seconds = 1;
}
File diff suppressed because one or more lines are too long
@@ -1,495 +0,0 @@
from datetime import datetime
import functools
import logging
import socket
import threading
import time

# pylint: disable=unused-import
from typing import Optional  # noqa
from google.protobuf import message  # noqa

from esphome import const
import esphome.api.api_pb2 as pb
from esphome.const import CONF_PASSWORD, CONF_PORT
from esphome.core import EsphomeError
from esphome.helpers import resolve_ip_address, indent, color
from esphome.py_compat import text_type, IS_PY2, byte_to_bytes, char_to_byte, format_bytes
from esphome.util import safe_print

_LOGGER = logging.getLogger(__name__)


class APIConnectionError(EsphomeError):
    pass


MESSAGE_TYPE_TO_PROTO = {
    1: pb.HelloRequest,
    2: pb.HelloResponse,
    3: pb.ConnectRequest,
    4: pb.ConnectResponse,
    5: pb.DisconnectRequest,
    6: pb.DisconnectResponse,
    7: pb.PingRequest,
    8: pb.PingResponse,
    9: pb.DeviceInfoRequest,
    10: pb.DeviceInfoResponse,
    11: pb.ListEntitiesRequest,
    12: pb.ListEntitiesBinarySensorResponse,
    13: pb.ListEntitiesCoverResponse,
    14: pb.ListEntitiesFanResponse,
    15: pb.ListEntitiesLightResponse,
    16: pb.ListEntitiesSensorResponse,
    17: pb.ListEntitiesSwitchResponse,
    18: pb.ListEntitiesTextSensorResponse,
    19: pb.ListEntitiesDoneResponse,
    20: pb.SubscribeStatesRequest,
    21: pb.BinarySensorStateResponse,
    22: pb.CoverStateResponse,
    23: pb.FanStateResponse,
    24: pb.LightStateResponse,
    25: pb.SensorStateResponse,
    26: pb.SwitchStateResponse,
    27: pb.TextSensorStateResponse,
    28: pb.SubscribeLogsRequest,
    29: pb.SubscribeLogsResponse,
    30: pb.CoverCommandRequest,
    31: pb.FanCommandRequest,
    32: pb.LightCommandRequest,
    33: pb.SwitchCommandRequest,
    34: pb.SubscribeServiceCallsRequest,
    35: pb.ServiceCallResponse,
    36: pb.GetTimeRequest,
    37: pb.GetTimeResponse,
}


def _varuint_to_bytes(value):
    if value <= 0x7F:
        return byte_to_bytes(value)

    ret = bytes()
    while value:
        temp = value & 0x7F
        value >>= 7
        if value:
            ret += byte_to_bytes(temp | 0x80)
        else:
            ret += byte_to_bytes(temp)

    return ret


def _bytes_to_varuint(value):
    result = 0
    bitpos = 0
    for c in value:
        val = char_to_byte(c)
        result |= (val & 0x7F) << bitpos
        bitpos += 7
        if (val & 0x80) == 0:
            return result
    return None
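# Editor's illustration (not part of the original module): the frame length and
# message-type fields below use this varuint encoding, low 7 bits first with the
# high bit marking a continuation byte. A quick round trip:
#
#     _varuint_to_bytes(5)            # -> b'\x05'
#     _varuint_to_bytes(300)          # -> b'\xac\x02'
#     _bytes_to_varuint(b'\xac\x02')  # -> 300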


# pylint: disable=too-many-instance-attributes,not-callable
class APIClient(threading.Thread):
    def __init__(self, address, port, password):
        threading.Thread.__init__(self)
        self._address = address  # type: str
        self._port = port  # type: int
        self._password = password  # type: Optional[str]
        self._socket = None  # type: Optional[socket.socket]
        self._socket_open_event = threading.Event()
        self._socket_write_lock = threading.Lock()
        self._connected = False
        self._authenticated = False
        self._message_handlers = []
        self._keepalive = 5
        self._ping_timer = None
        self._refresh_ping()

        self.on_disconnect = None
        self.on_connect = None
        self.on_login = None
        self.auto_reconnect = False
        self._running_event = threading.Event()
        self._stop_event = threading.Event()

    @property
    def stopped(self):
        return self._stop_event.is_set()

    def _refresh_ping(self):
        if self._ping_timer is not None:
            self._ping_timer.cancel()
            self._ping_timer = None

        def func():
            self._ping_timer = None

            if self._connected:
                try:
                    self.ping()
                except APIConnectionError:
                    self._fatal_error()
                else:
                    self._refresh_ping()

        self._ping_timer = threading.Timer(self._keepalive, func)
        self._ping_timer.start()

    def _cancel_ping(self):
        if self._ping_timer is not None:
            self._ping_timer.cancel()
            self._ping_timer = None

    def _close_socket(self):
        self._cancel_ping()
        if self._socket is not None:
            self._socket.close()
            self._socket = None
        self._socket_open_event.clear()
        self._connected = False
        self._authenticated = False
        self._message_handlers = []

    def stop(self, force=False):
        if self.stopped:
            raise ValueError

        if self._connected and not force:
            try:
                self.disconnect()
            except APIConnectionError:
                pass
        self._close_socket()

        self._stop_event.set()
        if not force:
            self.join()

    def connect(self):
        if not self._running_event.wait(0.1):
            raise APIConnectionError("You need to call start() first!")

        if self._connected:
            raise APIConnectionError("Already connected!")

        try:
            ip = resolve_ip_address(self._address)
        except EsphomeError as err:
            _LOGGER.warning("Error resolving IP address of %s. Is it connected to WiFi?",
                            self._address)
            _LOGGER.warning("(If this error persists, please set a static IP address: "
                            "https://esphome.io/components/wifi.html#manual-ips)")
            raise APIConnectionError(err)

        _LOGGER.info("Connecting to %s:%s (%s)", self._address, self._port, ip)
        self._socket = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        self._socket.settimeout(10.0)
        self._socket.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)
        try:
            self._socket.connect((ip, self._port))
        except socket.error as err:
            self._fatal_error()
            raise APIConnectionError("Error connecting to {}: {}".format(ip, err))
        self._socket.settimeout(0.1)

        self._socket_open_event.set()

        hello = pb.HelloRequest()
        hello.client_info = 'ESPHome v{}'.format(const.__version__)
        try:
            resp = self._send_message_await_response(hello, pb.HelloResponse)
        except APIConnectionError as err:
            self._fatal_error()
            raise err
        _LOGGER.debug("Successfully connected to %s ('%s' API=%s.%s)", self._address,
                      resp.server_info, resp.api_version_major, resp.api_version_minor)
        self._connected = True
        if self.on_connect is not None:
            self.on_connect()

    def _check_connected(self):
        if not self._connected:
            self._fatal_error()
            raise APIConnectionError("Must be connected!")

    def login(self):
        self._check_connected()
        if self._authenticated:
            raise APIConnectionError("Already logged in!")

        connect = pb.ConnectRequest()
        if self._password is not None:
            connect.password = self._password
        resp = self._send_message_await_response(connect, pb.ConnectResponse)
        if resp.invalid_password:
            raise APIConnectionError("Invalid password!")

        self._authenticated = True
        if self.on_login is not None:
            self.on_login()

    def _fatal_error(self):
        was_connected = self._connected

        self._close_socket()

        if was_connected and self.on_disconnect is not None:
            self.on_disconnect()

    def _write(self, data):  # type: (bytes) -> None
        if self._socket is None:
            raise APIConnectionError("Socket closed")

        _LOGGER.debug("Write: %s", format_bytes(data))
        with self._socket_write_lock:
            try:
                self._socket.sendall(data)
            except socket.error as err:
                self._fatal_error()
                raise APIConnectionError("Error while writing data: {}".format(err))

    def _send_message(self, msg):
        # type: (message.Message) -> None
        for message_type, klass in MESSAGE_TYPE_TO_PROTO.items():
            if isinstance(msg, klass):
                break
        else:
            raise ValueError

        encoded = msg.SerializeToString()
        _LOGGER.debug("Sending %s:\n%s", type(msg), indent(text_type(msg)))
        if IS_PY2:
            req = chr(0x00)
        else:
            req = bytes([0])
        req += _varuint_to_bytes(len(encoded))
        req += _varuint_to_bytes(message_type)
        req += encoded
        self._write(req)
        self._refresh_ping()
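    # Editor's note (illustration, not in the original file): the wire format
    # produced above, and parsed again in _run_once below, is
    #
    #   0x00 preamble | varuint payload size | varuint message type | protobuf payload
    #
    # so an empty PingRequest (type 7, zero-length payload) goes out as
    # b'\x00\x00\x07'.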

    def _send_message_await_response_complex(self, send_msg, do_append, do_stop, timeout=1):
        event = threading.Event()
        responses = []

        def on_message(resp):
            if do_append(resp):
                responses.append(resp)
            if do_stop(resp):
                event.set()

        self._message_handlers.append(on_message)
        self._send_message(send_msg)
        ret = event.wait(timeout)
        try:
            self._message_handlers.remove(on_message)
        except ValueError:
            pass
        if not ret:
            raise APIConnectionError("Timeout while waiting for message response!")
        return responses

    def _send_message_await_response(self, send_msg, response_type, timeout=1):
        def is_response(msg):
            return isinstance(msg, response_type)

        return self._send_message_await_response_complex(send_msg, is_response, is_response,
                                                         timeout)[0]

    def device_info(self):
        self._check_connected()
        return self._send_message_await_response(pb.DeviceInfoRequest(), pb.DeviceInfoResponse)

    def ping(self):
        self._check_connected()
        return self._send_message_await_response(pb.PingRequest(), pb.PingResponse)

    def disconnect(self):
        self._check_connected()

        try:
            self._send_message_await_response(pb.DisconnectRequest(), pb.DisconnectResponse)
        except APIConnectionError:
            pass
        self._close_socket()

        if self.on_disconnect is not None:
            self.on_disconnect()

    def _check_authenticated(self):
        if not self._authenticated:
            raise APIConnectionError("Must login first!")

    def subscribe_logs(self, on_log, log_level=None, dump_config=False):
        self._check_authenticated()

        def on_msg(msg):
            if isinstance(msg, pb.SubscribeLogsResponse):
                on_log(msg)

        self._message_handlers.append(on_msg)
        req = pb.SubscribeLogsRequest(dump_config=dump_config)
        if log_level is not None:
            req.level = log_level
        self._send_message(req)

    def _recv(self, amount):
        ret = bytes()
        if amount == 0:
            return ret

        while len(ret) < amount:
            if self.stopped:
                raise APIConnectionError("Stopped!")
            if not self._socket_open_event.is_set():
                raise APIConnectionError("No socket!")
            try:
                val = self._socket.recv(amount - len(ret))
            except AttributeError:
                raise APIConnectionError("Socket was closed")
            except socket.timeout:
                continue
            except socket.error as err:
                raise APIConnectionError("Error while receiving data: {}".format(err))
            ret += val
        return ret

    def _recv_varint(self):
        raw = bytes()
        while not raw or char_to_byte(raw[-1]) & 0x80:
            raw += self._recv(1)
        return _bytes_to_varuint(raw)

    def _run_once(self):
        if not self._socket_open_event.wait(0.1):
            return

        # Preamble
        if char_to_byte(self._recv(1)[0]) != 0x00:
            raise APIConnectionError("Invalid preamble")

        length = self._recv_varint()
        msg_type = self._recv_varint()

        raw_msg = self._recv(length)
        if msg_type not in MESSAGE_TYPE_TO_PROTO:
            _LOGGER.debug("Skipping message type %s", msg_type)
            return

        msg = MESSAGE_TYPE_TO_PROTO[msg_type]()
        msg.ParseFromString(raw_msg)
        _LOGGER.debug("Got message: %s:\n%s", type(msg), indent(str(msg)))
        for msg_handler in self._message_handlers[:]:
            msg_handler(msg)
        self._handle_internal_messages(msg)
        self._refresh_ping()

    def run(self):
        self._running_event.set()
        while not self.stopped:
            try:
                self._run_once()
            except APIConnectionError as err:
                if self.stopped:
                    break
                if self._connected:
                    _LOGGER.error("Error while reading incoming messages: %s", err)
                    self._fatal_error()
        self._running_event.clear()

    def _handle_internal_messages(self, msg):
        if isinstance(msg, pb.DisconnectRequest):
            self._send_message(pb.DisconnectResponse())
            if self._socket is not None:
                self._socket.close()
                self._socket = None
            self._connected = False
            if self.on_disconnect is not None:
                self.on_disconnect()
        elif isinstance(msg, pb.PingRequest):
            self._send_message(pb.PingResponse())
        elif isinstance(msg, pb.GetTimeRequest):
            resp = pb.GetTimeResponse()
            resp.epoch_seconds = int(time.time())
            self._send_message(resp)


def run_logs(config, address):
    conf = config['api']
    port = conf[CONF_PORT]
    password = conf[CONF_PASSWORD]
    _LOGGER.info("Starting log output from %s using esphome API", address)

    cli = APIClient(address, port, password)
    stopping = False
    retry_timer = []

    has_connects = []

    def try_connect(tries=0, is_disconnect=True):
        if stopping:
            return

        if is_disconnect:
            _LOGGER.warning(u"Disconnected from API.")

        while retry_timer:
            retry_timer.pop(0).cancel()

        error = None
        try:
            cli.connect()
            cli.login()
        except APIConnectionError as err:  # noqa
            error = err

        if error is None:
            _LOGGER.info("Successfully connected to %s", address)
            return

        wait_time = min(2**tries, 300)
        if not has_connects:
            _LOGGER.warning(u"Initial connection failed. The ESP might not be connected "
                            u"to WiFi yet (%s). Re-Trying in %s seconds",
                            error, wait_time)
        else:
            _LOGGER.warning(u"Couldn't connect to API (%s). Trying to reconnect in %s seconds",
                            error, wait_time)
        timer = threading.Timer(wait_time, functools.partial(try_connect, tries + 1, is_disconnect))
        timer.start()
        retry_timer.append(timer)

    def on_log(msg):
        time_ = datetime.now().time().strftime(u'[%H:%M:%S]')
        text = msg.message
        if msg.send_failed:
            text = color('white', '(Message skipped because it was too big to fit in '
                                  'TCP buffer - This is only cosmetic)')
        safe_print(time_ + text)

    def on_login():
        try:
            cli.subscribe_logs(on_log, dump_config=not has_connects)
            has_connects.append(True)
        except APIConnectionError:
            cli.disconnect()

    cli.on_disconnect = try_connect
    cli.on_login = on_login
    cli.start()

    try:
        try_connect(is_disconnect=False)
        while True:
            time.sleep(1)
    except KeyboardInterrupt:
        stopping = True
        cli.stop(True)
        while retry_timer:
            retry_timer.pop(0).cancel()
    return 0
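# Minimal usage sketch of APIClient (editor's illustration, not part of the
# original module; 'livingroom.local' and the empty password are assumptions):
#
#     client = APIClient('livingroom.local', 6053, '')
#     client.start()               # the run() loop begins reading frames
#     client.connect()             # HelloRequest/HelloResponse handshake
#     client.login()               # ConnectRequest; required before subscriptions
#     print(client.device_info())
#     client.stop()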
@@ -1,384 +0,0 @@
import copy

import voluptuous as vol

import esphome.config_validation as cv
from esphome.const import CONF_ABOVE, CONF_ACTION_ID, CONF_AND, CONF_AUTOMATION_ID, CONF_BELOW, \
    CONF_CONDITION, CONF_CONDITION_ID, CONF_DELAY, CONF_ELSE, CONF_ID, CONF_IF, CONF_LAMBDA, \
    CONF_OR, CONF_RANGE, CONF_THEN, CONF_TRIGGER_ID, CONF_WAIT_UNTIL, CONF_WHILE
from esphome.core import CORE
from esphome.cpp_generator import Pvariable, TemplateArguments, add, get_variable, \
    process_lambda, templatable
from esphome.cpp_types import Action, App, Component, PollingComponent, Trigger, bool_, \
    esphome_ns, float_, uint32, void
from esphome.util import ServiceRegistry


def maybe_simple_id(*validators):
    validator = vol.All(*validators)

    def validate(value):
        if isinstance(value, dict):
            return validator(value)
        return validator({CONF_ID: value})

    return validate
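# Editor's note: maybe_simple_id lets an action take either a full mapping or a
# bare ID. For example, both of these YAML forms validate to the same config
# (hypothetical ID):
#
#     component.update: my_component
#
#     component.update:
#       id: my_component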


def validate_recursive_condition(value):
    is_list = isinstance(value, list)
    value = cv.ensure_list()(value)[:]
    for i, item in enumerate(value):
        path = [i] if is_list else []
        item = copy.deepcopy(item)
        if not isinstance(item, dict):
            raise vol.Invalid(u"Condition must consist of key-value mapping! Got {}".format(item),
                              path)
        key = next((x for x in item if x != CONF_CONDITION_ID), None)
        if key is None:
            raise vol.Invalid(u"Key missing from action! Got {}".format(item), path)
        if key not in CONDITION_REGISTRY:
            raise vol.Invalid(u"Unable to find condition with the name '{}', is the "
                              u"component loaded?".format(key), path + [key])
        item.setdefault(CONF_CONDITION_ID, None)
        key2 = next((x for x in item if x not in (CONF_CONDITION_ID, key)), None)
        if key2 is not None:
            raise vol.Invalid(u"Cannot have two conditions in one item. Key '{}' overrides '{}'! "
                              u"Did you forget to indent the block inside the condition?"
                              u"".format(key, key2), path)
        validator = CONDITION_REGISTRY[key][0]
        try:
            condition = validator(item[key] or {})
        except vol.Invalid as err:
            err.prepend(path)
            raise err
        value[i] = {
            CONF_CONDITION_ID: cv.declare_variable_id(Condition)(item[CONF_CONDITION_ID]),
            key: condition,
        }
    return value


def validate_recursive_action(value):
    is_list = isinstance(value, list)
    if not is_list:
        value = [value]
    for i, item in enumerate(value):
        path = [i] if is_list else []
        item = copy.deepcopy(item)
        if not isinstance(item, dict):
            raise vol.Invalid(u"Action must consist of key-value mapping! Got {}".format(item),
                              path)
        key = next((x for x in item if x != CONF_ACTION_ID), None)
        if key is None:
            raise vol.Invalid(u"Key missing from action! Got {}".format(item), path)
        if key not in ACTION_REGISTRY:
            raise vol.Invalid(u"Unable to find action with the name '{}', is the component loaded?"
                              u"".format(key), path + [key])
        item.setdefault(CONF_ACTION_ID, None)
        key2 = next((x for x in item if x not in (CONF_ACTION_ID, key)), None)
        if key2 is not None:
            raise vol.Invalid(u"Cannot have two actions in one item. Key '{}' overrides '{}'! "
                              u"Did you forget to indent the block inside the action?"
                              u"".format(key, key2), path)
        validator = ACTION_REGISTRY[key][0]
        try:
            action = validator(item[key] or {})
        except vol.Invalid as err:
            err.prepend(path)
            raise err
        value[i] = {
            CONF_ACTION_ID: cv.declare_variable_id(Action)(item[CONF_ACTION_ID]),
            key: action,
        }
    return value


ACTION_REGISTRY = ServiceRegistry()
CONDITION_REGISTRY = ServiceRegistry()

# pylint: disable=invalid-name
DelayAction = esphome_ns.class_('DelayAction', Action, Component)
LambdaAction = esphome_ns.class_('LambdaAction', Action)
IfAction = esphome_ns.class_('IfAction', Action)
WhileAction = esphome_ns.class_('WhileAction', Action)
WaitUntilAction = esphome_ns.class_('WaitUntilAction', Action, Component)
UpdateComponentAction = esphome_ns.class_('UpdateComponentAction', Action)
Automation = esphome_ns.class_('Automation')

Condition = esphome_ns.class_('Condition')
AndCondition = esphome_ns.class_('AndCondition', Condition)
OrCondition = esphome_ns.class_('OrCondition', Condition)
RangeCondition = esphome_ns.class_('RangeCondition', Condition)
LambdaCondition = esphome_ns.class_('LambdaCondition', Condition)


def validate_automation(extra_schema=None, extra_validators=None, single=False):
    if extra_schema is None:
        extra_schema = {}
    if isinstance(extra_schema, vol.Schema):
        extra_schema = extra_schema.schema
    schema = AUTOMATION_SCHEMA.extend(extra_schema)

    def validator_(value):
        if isinstance(value, list):
            try:
                # First try as a sequence of actions
                return [schema({CONF_THEN: value})]
            except vol.Invalid as err:
                if err.path and err.path[0] == CONF_THEN:
                    err.path.pop(0)

                # Next try as a sequence of automations
                try:
                    return cv.Schema([schema])(value)
                except vol.Invalid as err2:
                    if 'Unable to find action' in str(err):
                        raise err2
                    raise vol.MultipleInvalid([err, err2])
        elif isinstance(value, dict):
            if CONF_THEN in value:
                return [schema(value)]
            return [schema({CONF_THEN: value})]
        # This should only happen with invalid configs, but let's have a nice error message.
        return [schema(value)]

    def validator(value):
        value = validator_(value)
        if extra_validators is not None:
            value = cv.Schema([extra_validators])(value)
        if single:
            if len(value) != 1:
                raise vol.Invalid("Cannot have more than 1 automation for templates")
            return value[0]
        return value

    return validator


AUTOMATION_SCHEMA = cv.Schema({
    cv.GenerateID(CONF_TRIGGER_ID): cv.declare_variable_id(Trigger),
    cv.GenerateID(CONF_AUTOMATION_ID): cv.declare_variable_id(Automation),
    vol.Required(CONF_THEN): validate_recursive_action,
})

AND_CONDITION_SCHEMA = validate_recursive_condition


@CONDITION_REGISTRY.register(CONF_AND, AND_CONDITION_SCHEMA)
def and_condition_to_code(config, condition_id, template_arg, args):
    for conditions in build_conditions(config, template_arg, args):
        yield
    rhs = AndCondition.new(template_arg, conditions)
    type = AndCondition.template(template_arg)
    yield Pvariable(condition_id, rhs, type=type)


OR_CONDITION_SCHEMA = validate_recursive_condition


@CONDITION_REGISTRY.register(CONF_OR, OR_CONDITION_SCHEMA)
def or_condition_to_code(config, condition_id, template_arg, args):
    for conditions in build_conditions(config, template_arg, args):
        yield
    rhs = OrCondition.new(template_arg, conditions)
    type = OrCondition.template(template_arg)
    yield Pvariable(condition_id, rhs, type=type)


RANGE_CONDITION_SCHEMA = vol.All(cv.Schema({
    vol.Optional(CONF_ABOVE): cv.templatable(cv.float_),
    vol.Optional(CONF_BELOW): cv.templatable(cv.float_),
}), cv.has_at_least_one_key(CONF_ABOVE, CONF_BELOW))


@CONDITION_REGISTRY.register(CONF_RANGE, RANGE_CONDITION_SCHEMA)
def range_condition_to_code(config, condition_id, template_arg, args):
    for conditions in build_conditions(config, template_arg, args):
        yield
    rhs = RangeCondition.new(template_arg, conditions)
    type = RangeCondition.template(template_arg)
    condition = Pvariable(condition_id, rhs, type=type)
    if CONF_ABOVE in config:
        for template_ in templatable(config[CONF_ABOVE], args, float_):
            yield
        condition.set_min(template_)
    if CONF_BELOW in config:
        for template_ in templatable(config[CONF_BELOW], args, float_):
            yield
        condition.set_max(template_)
    yield condition


DELAY_ACTION_SCHEMA = cv.templatable(cv.positive_time_period_milliseconds)


@ACTION_REGISTRY.register(CONF_DELAY, DELAY_ACTION_SCHEMA)
def delay_action_to_code(config, action_id, template_arg, args):
    rhs = App.register_component(DelayAction.new(template_arg))
    type = DelayAction.template(template_arg)
    action = Pvariable(action_id, rhs, type=type)
    for template_ in templatable(config, args, uint32):
        yield
    add(action.set_delay(template_))
    yield action


IF_ACTION_SCHEMA = vol.All({
    vol.Required(CONF_CONDITION): validate_recursive_condition,
    vol.Optional(CONF_THEN): validate_recursive_action,
    vol.Optional(CONF_ELSE): validate_recursive_action,
}, cv.has_at_least_one_key(CONF_THEN, CONF_ELSE))


@ACTION_REGISTRY.register(CONF_IF, IF_ACTION_SCHEMA)
def if_action_to_code(config, action_id, template_arg, args):
    for conditions in build_conditions(config[CONF_CONDITION], template_arg, args):
        yield None
    rhs = IfAction.new(template_arg, conditions)
    type = IfAction.template(template_arg)
    action = Pvariable(action_id, rhs, type=type)
    if CONF_THEN in config:
        for actions in build_actions(config[CONF_THEN], template_arg, args):
            yield None
        add(action.add_then(actions))
    if CONF_ELSE in config:
        for actions in build_actions(config[CONF_ELSE], template_arg, args):
            yield None
        add(action.add_else(actions))
    yield action


WHILE_ACTION_SCHEMA = cv.Schema({
    vol.Required(CONF_CONDITION): validate_recursive_condition,
    vol.Required(CONF_THEN): validate_recursive_action,
})


@ACTION_REGISTRY.register(CONF_WHILE, WHILE_ACTION_SCHEMA)
def while_action_to_code(config, action_id, template_arg, args):
    for conditions in build_conditions(config[CONF_CONDITION], template_arg, args):
        yield None
    rhs = WhileAction.new(template_arg, conditions)
    type = WhileAction.template(template_arg)
    action = Pvariable(action_id, rhs, type=type)
    for actions in build_actions(config[CONF_THEN], template_arg, args):
        yield None
    add(action.add_then(actions))
    yield action


def validate_wait_until(value):
    schema = cv.Schema({
        vol.Required(CONF_CONDITION): validate_recursive_condition
    })
    if isinstance(value, dict) and CONF_CONDITION in value:
        return schema(value)
    return validate_wait_until({CONF_CONDITION: value})


WAIT_UNTIL_ACTION_SCHEMA = validate_wait_until


@ACTION_REGISTRY.register(CONF_WAIT_UNTIL, WAIT_UNTIL_ACTION_SCHEMA)
def wait_until_action_to_code(config, action_id, template_arg, args):
    for conditions in build_conditions(config[CONF_CONDITION], template_arg, args):
        yield None
    rhs = WaitUntilAction.new(template_arg, conditions)
    type = WaitUntilAction.template(template_arg)
    action = Pvariable(action_id, rhs, type=type)
    add(App.register_component(action))
    yield action


LAMBDA_ACTION_SCHEMA = cv.lambda_


@ACTION_REGISTRY.register(CONF_LAMBDA, LAMBDA_ACTION_SCHEMA)
def lambda_action_to_code(config, action_id, template_arg, args):
    for lambda_ in process_lambda(config, args, return_type=void):
        yield None
    rhs = LambdaAction.new(template_arg, lambda_)
    type = LambdaAction.template(template_arg)
    yield Pvariable(action_id, rhs, type=type)


LAMBDA_CONDITION_SCHEMA = cv.lambda_


@CONDITION_REGISTRY.register(CONF_LAMBDA, LAMBDA_CONDITION_SCHEMA)
def lambda_condition_to_code(config, condition_id, template_arg, args):
    for lambda_ in process_lambda(config, args, return_type=bool_):
        yield
    rhs = LambdaCondition.new(template_arg, lambda_)
    type = LambdaCondition.template(template_arg)
    yield Pvariable(condition_id, rhs, type=type)


CONF_COMPONENT_UPDATE = 'component.update'
COMPONENT_UPDATE_ACTION_SCHEMA = maybe_simple_id({
    vol.Required(CONF_ID): cv.use_variable_id(PollingComponent),
})


@ACTION_REGISTRY.register(CONF_COMPONENT_UPDATE, COMPONENT_UPDATE_ACTION_SCHEMA)
def component_update_action_to_code(config, action_id, template_arg, args):
    for var in get_variable(config[CONF_ID]):
        yield None
    rhs = UpdateComponentAction.new(template_arg, var)
    type = UpdateComponentAction.template(template_arg)
    yield Pvariable(action_id, rhs, type=type)


def build_action(full_config, template_arg, args):
    action_id = full_config[CONF_ACTION_ID]
    key, config = next((k, v) for k, v in full_config.items() if k in ACTION_REGISTRY)

    builder = ACTION_REGISTRY[key][1]
    for result in builder(config, action_id, template_arg, args):
        yield None
    yield result


def build_actions(config, templ, arg_type):
    actions = []
    for conf in config:
        for action in build_action(conf, templ, arg_type):
            yield None
        actions.append(action)
    yield actions


def build_condition(full_config, template_arg, args):
    action_id = full_config[CONF_CONDITION_ID]
    key, config = next((k, v) for k, v in full_config.items() if k in CONDITION_REGISTRY)

    builder = CONDITION_REGISTRY[key][1]
    for result in builder(config, action_id, template_arg, args):
        yield None
    yield result


def build_conditions(config, templ, args):
    conditions = []
    for conf in config:
        for condition in build_condition(conf, templ, args):
            yield None
        conditions.append(condition)
    yield conditions


def build_automation_(trigger, args, config):
    arg_types = [arg[0] for arg in args]
    templ = TemplateArguments(*arg_types)
    rhs = App.make_automation(templ, trigger)
    type = Automation.template(templ)
    obj = Pvariable(config[CONF_AUTOMATION_ID], rhs, type=type)
    for actions in build_actions(config[CONF_THEN], templ, args):
        yield None
    add(obj.add_actions(actions))
    yield obj


def build_automations(trigger, args, config):
    CORE.add_job(build_automation_, trigger, args, config)
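# Editor's note on the generator style used throughout this module: the
# *_to_code builders and build_* helpers are coroutine-like generators driven
# by CORE.add_job. Each bare `yield`/`yield None` suspends the job until a
# dependency (such as a variable produced by another job via get_variable) is
# available; the final `yield <value>` hands back the generated expression.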
@@ -1,27 +0,0 @@
import voluptuous as vol

from esphome.components import i2c, sensor
import esphome.config_validation as cv
from esphome.const import CONF_ADDRESS, CONF_ID
from esphome.cpp_generator import Pvariable
from esphome.cpp_helpers import setup_component
from esphome.cpp_types import App, Component

DEPENDENCIES = ['i2c']
MULTI_CONF = True

ADS1115Component = sensor.sensor_ns.class_('ADS1115Component', Component, i2c.I2CDevice)

CONFIG_SCHEMA = cv.Schema({
    cv.GenerateID(): cv.declare_variable_id(ADS1115Component),
    vol.Required(CONF_ADDRESS): cv.i2c_address,
}).extend(cv.COMPONENT_SCHEMA.schema)


def to_code(config):
    rhs = App.make_ads1115_component(config[CONF_ADDRESS])
    var = Pvariable(config[CONF_ID], rhs)
    setup_component(var, config)


BUILD_FLAGS = '-DUSE_ADS1115_SENSOR'
@@ -1,33 +0,0 @@
import voluptuous as vol

from esphome.components import i2c, sensor
import esphome.config_validation as cv
from esphome.const import CONF_ADDRESS, CONF_ID, CONF_UPDATE_INTERVAL
from esphome.cpp_generator import Pvariable, add
from esphome.cpp_helpers import setup_component
from esphome.cpp_types import App, PollingComponent

DEPENDENCIES = ['i2c']
MULTI_CONF = True

CONF_APDS9960_ID = 'apds9960_id'
APDS9960 = sensor.sensor_ns.class_('APDS9960', PollingComponent, i2c.I2CDevice)

CONFIG_SCHEMA = cv.Schema({
    cv.GenerateID(): cv.declare_variable_id(APDS9960),
    vol.Optional(CONF_ADDRESS): cv.i2c_address,
    vol.Optional(CONF_UPDATE_INTERVAL): cv.update_interval,
}).extend(cv.COMPONENT_SCHEMA.schema)


def to_code(config):
    rhs = App.make_apds9960(config.get(CONF_UPDATE_INTERVAL))
    var = Pvariable(config[CONF_ID], rhs)

    if CONF_ADDRESS in config:
        add(var.set_address(config[CONF_ADDRESS]))

    setup_component(var, config)


BUILD_FLAGS = '-DUSE_APDS9960'
@@ -1,141 +0,0 @@
import voluptuous as vol

from esphome import automation
from esphome.automation import ACTION_REGISTRY, CONDITION_REGISTRY, Condition
import esphome.config_validation as cv
from esphome.const import CONF_DATA, CONF_DATA_TEMPLATE, CONF_ID, CONF_PASSWORD, CONF_PORT, \
    CONF_REBOOT_TIMEOUT, CONF_SERVICE, CONF_VARIABLES, CONF_SERVICES, CONF_TRIGGER_ID
from esphome.core import CORE
from esphome.cpp_generator import Pvariable, add, get_variable, process_lambda
from esphome.cpp_helpers import setup_component
from esphome.cpp_types import Action, App, Component, StoringController, esphome_ns, Trigger, \
    bool_, int32, float_, std_string

api_ns = esphome_ns.namespace('api')
APIServer = api_ns.class_('APIServer', Component, StoringController)
HomeAssistantServiceCallAction = api_ns.class_('HomeAssistantServiceCallAction', Action)
KeyValuePair = api_ns.class_('KeyValuePair')
TemplatableKeyValuePair = api_ns.class_('TemplatableKeyValuePair')
APIConnectedCondition = api_ns.class_('APIConnectedCondition', Condition)

UserService = api_ns.class_('UserService', Trigger)
ServiceTypeArgument = api_ns.class_('ServiceTypeArgument')
ServiceArgType = api_ns.enum('ServiceArgType')
SERVICE_ARG_TYPES = {
    'bool': ServiceArgType.SERVICE_ARG_TYPE_BOOL,
    'int': ServiceArgType.SERVICE_ARG_TYPE_INT,
    'float': ServiceArgType.SERVICE_ARG_TYPE_FLOAT,
    'string': ServiceArgType.SERVICE_ARG_TYPE_STRING,
}
SERVICE_ARG_NATIVE_TYPES = {
    'bool': bool_,
    'int': int32,
    'float': float_,
    'string': std_string,
}


CONFIG_SCHEMA = cv.Schema({
    cv.GenerateID(): cv.declare_variable_id(APIServer),
    vol.Optional(CONF_PORT, default=6053): cv.port,
    vol.Optional(CONF_PASSWORD, default=''): cv.string_strict,
    vol.Optional(CONF_REBOOT_TIMEOUT): cv.positive_time_period_milliseconds,
    vol.Optional(CONF_SERVICES): automation.validate_automation({
        cv.GenerateID(CONF_TRIGGER_ID): cv.declare_variable_id(UserService),
        vol.Required(CONF_SERVICE): cv.valid_name,
        vol.Optional(CONF_VARIABLES, default={}): cv.Schema({
            cv.validate_id_name: cv.one_of(*SERVICE_ARG_TYPES, lower=True),
        }),
    }),
}).extend(cv.COMPONENT_SCHEMA.schema)
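# A config accepted by the schema above might look like this in YAML (editor's
# illustration; 'start_effect' and its variable are made-up names):
#
#     api:
#       password: 'MyPassword'
#       services:
#         - service: start_effect
#           variables:
#             my_brightness: int
#           then:
#             - lambda: 'ESP_LOGD("main", "Brightness: %d", my_brightness);'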


def to_code(config):
    rhs = App.init_api_server()
    api = Pvariable(config[CONF_ID], rhs)

    if config[CONF_PORT] != 6053:
        add(api.set_port(config[CONF_PORT]))
    if config.get(CONF_PASSWORD):
        add(api.set_password(config[CONF_PASSWORD]))
    if CONF_REBOOT_TIMEOUT in config:
        add(api.set_reboot_timeout(config[CONF_REBOOT_TIMEOUT]))

    for conf in config.get(CONF_SERVICES, []):
        template_args = []
        func_args = []
        service_type_args = []
        for name, var_ in conf[CONF_VARIABLES].items():
            native = SERVICE_ARG_NATIVE_TYPES[var_]
            template_args.append(native)
            func_args.append((native, name))
            service_type_args.append(ServiceTypeArgument(name, SERVICE_ARG_TYPES[var_]))
        func = api.make_user_service_trigger.template(*template_args)
        rhs = func(conf[CONF_SERVICE], service_type_args)
        type_ = UserService.template(*template_args)
        trigger = Pvariable(conf[CONF_TRIGGER_ID], rhs, type=type_)
        automation.build_automations(trigger, func_args, conf)

    setup_component(api, config)


BUILD_FLAGS = '-DUSE_API'


def lib_deps(config):
    if CORE.is_esp32:
        return 'AsyncTCP@1.0.3'
    if CORE.is_esp8266:
        return 'ESPAsyncTCP@1.1.3'
    raise NotImplementedError


CONF_HOMEASSISTANT_SERVICE = 'homeassistant.service'
HOMEASSISTANT_SERVIC_ACTION_SCHEMA = cv.Schema({
    cv.GenerateID(): cv.use_variable_id(APIServer),
    vol.Required(CONF_SERVICE): cv.string,
    vol.Optional(CONF_DATA): cv.Schema({
        cv.string: cv.string,
    }),
    vol.Optional(CONF_DATA_TEMPLATE): cv.Schema({
        cv.string: cv.string,
    }),
    vol.Optional(CONF_VARIABLES): cv.Schema({
        cv.string: cv.lambda_,
    }),
})


@ACTION_REGISTRY.register(CONF_HOMEASSISTANT_SERVICE, HOMEASSISTANT_SERVIC_ACTION_SCHEMA)
def homeassistant_service_to_code(config, action_id, template_arg, args):
    for var in get_variable(config[CONF_ID]):
        yield None
    rhs = var.make_home_assistant_service_call_action(template_arg)
    type = HomeAssistantServiceCallAction.template(template_arg)
    act = Pvariable(action_id, rhs, type=type)
    add(act.set_service(config[CONF_SERVICE]))
    if CONF_DATA in config:
        datas = [KeyValuePair(k, v) for k, v in config[CONF_DATA].items()]
        add(act.set_data(datas))
    if CONF_DATA_TEMPLATE in config:
        datas = [KeyValuePair(k, v) for k, v in config[CONF_DATA_TEMPLATE].items()]
        add(act.set_data_template(datas))
    if CONF_VARIABLES in config:
        datas = []
        for key, value in config[CONF_VARIABLES].items():
            for value_ in process_lambda(value, []):
                yield None
            datas.append(TemplatableKeyValuePair(key, value_))
        add(act.set_variables(datas))
    yield act


CONF_API_CONNECTED = 'api.connected'
API_CONNECTED_CONDITION_SCHEMA = cv.Schema({})


@CONDITION_REGISTRY.register(CONF_API_CONNECTED, API_CONNECTED_CONDITION_SCHEMA)
def api_connected_to_code(config, condition_id, template_arg, args):
    rhs = APIConnectedCondition.new(template_arg)
    type = APIConnectedCondition.template(template_arg)
    yield Pvariable(condition_id, rhs, type=type)
@@ -1,36 +0,0 @@
import voluptuous as vol

from esphome.components import binary_sensor, sensor
from esphome.components.apds9960 import APDS9960, CONF_APDS9960_ID
import esphome.config_validation as cv
from esphome.const import CONF_DIRECTION, CONF_NAME
from esphome.cpp_generator import get_variable

DEPENDENCIES = ['apds9960']
APDS9960GestureDirectionBinarySensor = sensor.sensor_ns.class_(
    'APDS9960GestureDirectionBinarySensor', binary_sensor.BinarySensor)

DIRECTIONS = {
    'UP': 'make_up_direction',
    'DOWN': 'make_down_direction',
    'LEFT': 'make_left_direction',
    'RIGHT': 'make_right_direction',
}

PLATFORM_SCHEMA = cv.nameable(binary_sensor.BINARY_SENSOR_PLATFORM_SCHEMA.extend({
    cv.GenerateID(): cv.declare_variable_id(APDS9960GestureDirectionBinarySensor),
    vol.Required(CONF_DIRECTION): cv.one_of(*DIRECTIONS, upper=True),
    cv.GenerateID(CONF_APDS9960_ID): cv.use_variable_id(APDS9960)
}))


def to_code(config):
    for hub in get_variable(config[CONF_APDS9960_ID]):
        yield
    func = getattr(hub, DIRECTIONS[config[CONF_DIRECTION]])
    rhs = func(config[CONF_NAME])
    binary_sensor.register_binary_sensor(rhs, config)


def to_hass_config(data, config):
    return binary_sensor.core_to_hass_config(data, config)
@@ -1,39 +0,0 @@
import voluptuous as vol

from esphome.components import binary_sensor
import esphome.config_validation as cv
from esphome.const import CONF_BINARY_SENSORS, CONF_ID, CONF_LAMBDA, CONF_NAME
from esphome.cpp_generator import add, process_lambda, variable
from esphome.cpp_types import std_vector

CustomBinarySensorConstructor = binary_sensor.binary_sensor_ns.class_(
    'CustomBinarySensorConstructor')

PLATFORM_SCHEMA = binary_sensor.PLATFORM_SCHEMA.extend({
    cv.GenerateID(): cv.declare_variable_id(CustomBinarySensorConstructor),
    vol.Required(CONF_LAMBDA): cv.lambda_,
    vol.Required(CONF_BINARY_SENSORS):
        cv.ensure_list(binary_sensor.BINARY_SENSOR_SCHEMA.extend({
            cv.GenerateID(): cv.declare_variable_id(binary_sensor.BinarySensor),
        })),
})


def to_code(config):
    for template_ in process_lambda(config[CONF_LAMBDA], [],
                                    return_type=std_vector.template(binary_sensor.BinarySensorPtr)):
        yield

    rhs = CustomBinarySensorConstructor(template_)
    custom = variable(config[CONF_ID], rhs)
    for i, conf in enumerate(config[CONF_BINARY_SENSORS]):
        rhs = custom.Pget_binary_sensor(i)
        add(rhs.set_name(conf[CONF_NAME]))
        binary_sensor.register_binary_sensor(rhs, conf)


BUILD_FLAGS = '-DUSE_CUSTOM_BINARY_SENSOR'


def to_hass_config(data, config):
    return [binary_sensor.core_to_hass_config(data, sens) for sens in config[CONF_BINARY_SENSORS]]
@@ -1,30 +0,0 @@
import voluptuous as vol

from esphome.components import binary_sensor
import esphome.config_validation as cv
from esphome.const import CONF_ENTITY_ID, CONF_ID, CONF_NAME
from esphome.cpp_generator import Pvariable
from esphome.cpp_types import App

DEPENDENCIES = ['api']

HomeassistantBinarySensor = binary_sensor.binary_sensor_ns.class_('HomeassistantBinarySensor',
                                                                  binary_sensor.BinarySensor)

PLATFORM_SCHEMA = cv.nameable(binary_sensor.BINARY_SENSOR_PLATFORM_SCHEMA.extend({
    cv.GenerateID(): cv.declare_variable_id(HomeassistantBinarySensor),
    vol.Required(CONF_ENTITY_ID): cv.entity_id,
}))


def to_code(config):
    rhs = App.make_homeassistant_binary_sensor(config[CONF_NAME], config[CONF_ENTITY_ID])
    subs = Pvariable(config[CONF_ID], rhs)
    binary_sensor.setup_binary_sensor(subs, config)


BUILD_FLAGS = '-DUSE_HOMEASSISTANT_BINARY_SENSOR'


def to_hass_config(data, config):
    return binary_sensor.core_to_hass_config(data, config)
@@ -1,28 +0,0 @@
import voluptuous as vol

from esphome.components import binary_sensor
from esphome.components.mpr121 import MPR121Component, CONF_MPR121_ID
import esphome.config_validation as cv
from esphome.const import CONF_CHANNEL, CONF_NAME
from esphome.cpp_generator import get_variable

DEPENDENCIES = ['mpr121']
MPR121Channel = binary_sensor.binary_sensor_ns.class_(
    'MPR121Channel', binary_sensor.BinarySensor)

PLATFORM_SCHEMA = cv.nameable(binary_sensor.BINARY_SENSOR_PLATFORM_SCHEMA.extend({
    cv.GenerateID(): cv.declare_variable_id(MPR121Channel),
    cv.GenerateID(CONF_MPR121_ID): cv.use_variable_id(MPR121Component),
    vol.Required(CONF_CHANNEL): vol.All(vol.Coerce(int), vol.Range(min=0, max=11))
}))


def to_code(config):
    for hub in get_variable(config[CONF_MPR121_ID]):
        yield
    rhs = MPR121Channel.new(config[CONF_NAME], config[CONF_CHANNEL])
    binary_sensor.register_binary_sensor(hub.add_channel(rhs), config)


def to_hass_config(data, config):
    return binary_sensor.core_to_hass_config(data, config)
@@ -1,60 +0,0 @@
import voluptuous as vol

from esphome.automation import ACTION_REGISTRY
from esphome.components import binary_sensor
import esphome.config_validation as cv
from esphome.const import CONF_ID, CONF_LAMBDA, CONF_NAME, CONF_STATE
from esphome.cpp_generator import Pvariable, add, get_variable, process_lambda, templatable
from esphome.cpp_helpers import setup_component
from esphome.cpp_types import Action, App, Component, bool_, optional

TemplateBinarySensor = binary_sensor.binary_sensor_ns.class_('TemplateBinarySensor',
                                                             binary_sensor.BinarySensor,
                                                             Component)
BinarySensorPublishAction = binary_sensor.binary_sensor_ns.class_('BinarySensorPublishAction',
                                                                  Action)

PLATFORM_SCHEMA = cv.nameable(binary_sensor.BINARY_SENSOR_PLATFORM_SCHEMA.extend({
    cv.GenerateID(): cv.declare_variable_id(TemplateBinarySensor),
    vol.Optional(CONF_LAMBDA): cv.lambda_,
}).extend(cv.COMPONENT_SCHEMA.schema))


def to_code(config):
    rhs = App.make_template_binary_sensor(config[CONF_NAME])
    var = Pvariable(config[CONF_ID], rhs)
    binary_sensor.setup_binary_sensor(var, config)
    setup_component(var, config)

    if CONF_LAMBDA in config:
        for template_ in process_lambda(config[CONF_LAMBDA], [],
                                        return_type=optional.template(bool_)):
            yield
        add(var.set_template(template_))


BUILD_FLAGS = '-DUSE_TEMPLATE_BINARY_SENSOR'

CONF_BINARY_SENSOR_TEMPLATE_PUBLISH = 'binary_sensor.template.publish'
BINARY_SENSOR_TEMPLATE_PUBLISH_ACTION_SCHEMA = cv.Schema({
    vol.Required(CONF_ID): cv.use_variable_id(binary_sensor.BinarySensor),
    vol.Required(CONF_STATE): cv.templatable(cv.boolean),
})


@ACTION_REGISTRY.register(CONF_BINARY_SENSOR_TEMPLATE_PUBLISH,
                          BINARY_SENSOR_TEMPLATE_PUBLISH_ACTION_SCHEMA)
def binary_sensor_template_publish_to_code(config, action_id, template_arg, args):
    for var in get_variable(config[CONF_ID]):
        yield None
    rhs = var.make_binary_sensor_publish_action(template_arg)
    type = BinarySensorPublishAction.template(template_arg)
    action = Pvariable(action_id, rhs, type=type)
    for template_ in templatable(config[CONF_STATE], args, bool_):
        yield None
    add(action.set_state(template_))
    yield action


def to_hass_config(data, config):
    return binary_sensor.core_to_hass_config(data, config)
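# Editor's illustration of the action registered above, in automation YAML
# (hypothetical ID and trigger):
#
#     on_...:
#       then:
#         - binary_sensor.template.publish:
#             id: my_template_sensor
#             state: ON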
@@ -1,88 +0,0 @@
import voluptuous as vol

from esphome import automation
from esphome.automation import ACTION_REGISTRY
from esphome.components import cover
import esphome.config_validation as cv
from esphome.const import CONF_ASSUMED_STATE, CONF_CLOSE_ACTION, CONF_ID, CONF_LAMBDA, CONF_NAME, \
    CONF_OPEN_ACTION, CONF_OPTIMISTIC, CONF_STATE, CONF_STOP_ACTION
from esphome.cpp_generator import Pvariable, add, get_variable, process_lambda, templatable
from esphome.cpp_helpers import setup_component
from esphome.cpp_types import Action, App, optional
from esphome.py_compat import string_types

TemplateCover = cover.cover_ns.class_('TemplateCover', cover.Cover)
CoverPublishAction = cover.cover_ns.class_('CoverPublishAction', Action)

PLATFORM_SCHEMA = cv.nameable(cover.COVER_PLATFORM_SCHEMA.extend({
    cv.GenerateID(): cv.declare_variable_id(TemplateCover),
    vol.Optional(CONF_LAMBDA): cv.lambda_,
    vol.Optional(CONF_OPTIMISTIC): cv.boolean,
    vol.Optional(CONF_ASSUMED_STATE): cv.boolean,
    vol.Optional(CONF_OPEN_ACTION): automation.validate_automation(single=True),
    vol.Optional(CONF_CLOSE_ACTION): automation.validate_automation(single=True),
    vol.Optional(CONF_STOP_ACTION): automation.validate_automation(single=True),
}).extend(cv.COMPONENT_SCHEMA.schema))


def to_code(config):
    rhs = App.make_template_cover(config[CONF_NAME])
    var = Pvariable(config[CONF_ID], rhs)

    cover.setup_cover(var, config)
    setup_component(var, config)

    if CONF_LAMBDA in config:
        for template_ in process_lambda(config[CONF_LAMBDA], [],
                                        return_type=optional.template(cover.CoverState)):
            yield
        add(var.set_state_lambda(template_))
    if CONF_OPEN_ACTION in config:
        automation.build_automations(var.get_open_trigger(), [],
                                     config[CONF_OPEN_ACTION])
    if CONF_CLOSE_ACTION in config:
        automation.build_automations(var.get_close_trigger(), [],
                                     config[CONF_CLOSE_ACTION])
    if CONF_STOP_ACTION in config:
        automation.build_automations(var.get_stop_trigger(), [],
                                     config[CONF_STOP_ACTION])
    if CONF_OPTIMISTIC in config:
        add(var.set_optimistic(config[CONF_OPTIMISTIC]))
    if CONF_ASSUMED_STATE in config:
        add(var.set_assumed_state(config[CONF_ASSUMED_STATE]))


BUILD_FLAGS = '-DUSE_TEMPLATE_COVER'

CONF_COVER_TEMPLATE_PUBLISH = 'cover.template.publish'
COVER_TEMPLATE_PUBLISH_ACTION_SCHEMA = cv.Schema({
    vol.Required(CONF_ID): cv.use_variable_id(cover.Cover),
    vol.Required(CONF_STATE): cv.templatable(cover.validate_cover_state),
})


@ACTION_REGISTRY.register(CONF_COVER_TEMPLATE_PUBLISH,
                          COVER_TEMPLATE_PUBLISH_ACTION_SCHEMA)
def cover_template_publish_to_code(config, action_id, template_arg, args):
    for var in get_variable(config[CONF_ID]):
        yield None
    rhs = var.make_cover_publish_action(template_arg)
    type = CoverPublishAction.template(template_arg)
    action = Pvariable(action_id, rhs, type=type)
    state = config[CONF_STATE]
    if isinstance(state, string_types):
        template_ = cover.COVER_STATES[state]
    else:
        for template_ in templatable(state, args, cover.CoverState):
            yield None
    add(action.set_state(template_))
    yield action


def to_hass_config(data, config):
    ret = cover.core_to_hass_config(data, config)
    if ret is None:
        return None
    if CONF_OPTIMISTIC in config:
        ret['optimistic'] = config[CONF_OPTIMISTIC]
    return ret
@@ -1,33 +0,0 @@
import voluptuous as vol

import esphome.config_validation as cv
from esphome.const import CONF_COMPONENTS, CONF_ID, CONF_LAMBDA
from esphome.cpp_generator import Pvariable, process_lambda, variable
from esphome.cpp_helpers import setup_component
from esphome.cpp_types import Component, ComponentPtr, esphome_ns, std_vector

CustomComponentConstructor = esphome_ns.class_('CustomComponentConstructor')
MULTI_CONF = True

CONFIG_SCHEMA = cv.Schema({
    cv.GenerateID(): cv.declare_variable_id(CustomComponentConstructor),
    vol.Required(CONF_LAMBDA): cv.lambda_,
    vol.Optional(CONF_COMPONENTS): cv.ensure_list(cv.Schema({
        cv.GenerateID(): cv.declare_variable_id(Component)
    }).extend(cv.COMPONENT_SCHEMA.schema)),
})


def to_code(config):
    for template_ in process_lambda(config[CONF_LAMBDA], [],
                                    return_type=std_vector.template(ComponentPtr)):
        yield

    rhs = CustomComponentConstructor(template_)
    custom = variable(config[CONF_ID], rhs)
    for i, comp_config in enumerate(config.get(CONF_COMPONENTS, [])):
        comp = Pvariable(comp_config[CONF_ID], custom.get_component(i))
        setup_component(comp, comp_config)


BUILD_FLAGS = '-DUSE_CUSTOM_COMPONENT'
@@ -1,27 +0,0 @@
import voluptuous as vol

from esphome import pins
from esphome.components import sensor
import esphome.config_validation as cv
from esphome.const import CONF_ID, CONF_PIN, CONF_UPDATE_INTERVAL
from esphome.cpp_generator import Pvariable
from esphome.cpp_helpers import setup_component
from esphome.cpp_types import App, PollingComponent

DallasComponent = sensor.sensor_ns.class_('DallasComponent', PollingComponent)
MULTI_CONF = True

CONFIG_SCHEMA = cv.Schema({
    cv.GenerateID(): cv.declare_variable_id(DallasComponent),
    vol.Required(CONF_PIN): pins.input_pin,
    vol.Optional(CONF_UPDATE_INTERVAL): cv.update_interval,
}).extend(cv.COMPONENT_SCHEMA.schema)


def to_code(config):
    rhs = App.make_dallas_component(config[CONF_PIN], config.get(CONF_UPDATE_INTERVAL))
    var = Pvariable(config[CONF_ID], rhs)
    setup_component(var, config)


BUILD_FLAGS = '-DUSE_DALLAS_SENSOR'
@@ -1,14 +0,0 @@
import esphome.config_validation as cv
from esphome.cpp_generator import add
from esphome.cpp_types import App

DEPENDENCIES = ['logger']

CONFIG_SCHEMA = cv.Schema({})


def to_code(config):
    add(App.make_debug_component())


BUILD_FLAGS = '-DUSE_DEBUG_COMPONENT'
@@ -1,127 +0,0 @@
# coding=utf-8
import voluptuous as vol

from esphome import core
from esphome.automation import ACTION_REGISTRY, maybe_simple_id
import esphome.config_validation as cv
from esphome.const import CONF_LAMBDA, CONF_ROTATION, CONF_UPDATE_INTERVAL, CONF_PAGES, CONF_ID
from esphome.core import CORE
from esphome.cpp_generator import add, process_lambda, Pvariable, templatable, get_variable
from esphome.cpp_types import esphome_ns, void, Action

PLATFORM_SCHEMA = cv.PLATFORM_SCHEMA.extend({

})

display_ns = esphome_ns.namespace('display')
DisplayBuffer = display_ns.class_('DisplayBuffer')
DisplayPage = display_ns.class_('DisplayPage')
DisplayPagePtr = DisplayPage.operator('ptr')
DisplayBufferRef = DisplayBuffer.operator('ref')
DisplayPageShowAction = display_ns.class_('DisplayPageShowAction', Action)
DisplayPageShowNextAction = display_ns.class_('DisplayPageShowNextAction', Action)
DisplayPageShowPrevAction = display_ns.class_('DisplayPageShowPrevAction', Action)

DISPLAY_ROTATIONS = {
    0: display_ns.DISPLAY_ROTATION_0_DEGREES,
    90: display_ns.DISPLAY_ROTATION_90_DEGREES,
    180: display_ns.DISPLAY_ROTATION_180_DEGREES,
    270: display_ns.DISPLAY_ROTATION_270_DEGREES,
}


def validate_rotation(value):
    value = cv.string(value)
    if value.endswith(u"°"):
        value = value[:-1]
    try:
        value = int(value)
    except ValueError:
        raise vol.Invalid(u"Expected integer for rotation")
    return cv.one_of(*DISPLAY_ROTATIONS)(value)
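# Editor's example: validate_rotation accepts "90°", "90", or 90 and normalizes
# them all to the integer 90, which must then be one of the keys of
# DISPLAY_ROTATIONS above; anything else raises vol.Invalid.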
|
||||
|
||||
BASIC_DISPLAY_PLATFORM_SCHEMA = PLATFORM_SCHEMA.extend({
|
||||
vol.Optional(CONF_UPDATE_INTERVAL): cv.update_interval,
|
||||
vol.Optional(CONF_LAMBDA): cv.lambda_,
|
||||
})
|
||||
|
||||
FULL_DISPLAY_PLATFORM_SCHEMA = BASIC_DISPLAY_PLATFORM_SCHEMA.extend({
|
||||
vol.Optional(CONF_ROTATION): validate_rotation,
|
||||
vol.Optional(CONF_PAGES): vol.All(cv.ensure_list({
|
||||
cv.GenerateID(): cv.declare_variable_id(DisplayPage),
|
||||
vol.Required(CONF_LAMBDA): cv.lambda_,
|
||||
}), vol.Length(min=1)),
|
||||
})
|
||||
|
||||
|
||||
def setup_display_core_(display_var, config):
|
||||
if CONF_UPDATE_INTERVAL in config:
|
||||
add(display_var.set_update_interval(config[CONF_UPDATE_INTERVAL]))
|
||||
if CONF_ROTATION in config:
|
||||
add(display_var.set_rotation(DISPLAY_ROTATIONS[config[CONF_ROTATION]]))
|
||||
if CONF_PAGES in config:
|
||||
pages = []
|
||||
for conf in config[CONF_PAGES]:
|
||||
for lambda_ in process_lambda(conf[CONF_LAMBDA], [(DisplayBufferRef, 'it')],
|
||||
return_type=void):
|
||||
yield
|
||||
var = Pvariable(conf[CONF_ID], DisplayPage.new(lambda_))
|
||||
pages.append(var)
|
||||
add(display_var.set_pages(pages))
|
||||
|
||||
|
||||
CONF_DISPLAY_PAGE_SHOW = 'display.page.show'
|
||||
DISPLAY_PAGE_SHOW_ACTION_SCHEMA = maybe_simple_id({
|
||||
vol.Required(CONF_ID): cv.templatable(cv.use_variable_id(DisplayPage)),
|
||||
})
|
||||
|
||||
|
||||
@ACTION_REGISTRY.register(CONF_DISPLAY_PAGE_SHOW, DISPLAY_PAGE_SHOW_ACTION_SCHEMA)
|
||||
def display_page_show_to_code(config, action_id, template_arg, args):
|
||||
type = DisplayPageShowAction.template(template_arg)
|
||||
action = Pvariable(action_id, type.new(), type=type)
|
||||
if isinstance(config[CONF_ID], core.Lambda):
|
||||
for template_ in templatable(config[CONF_ID], args, DisplayPagePtr):
|
||||
yield None
|
||||
add(action.set_page(template_))
|
||||
else:
|
||||
for var in get_variable(config[CONF_ID]):
|
||||
yield None
|
||||
add(action.set_page(var))
|
||||
yield action
|
||||
|
||||
|
||||
CONF_DISPLAY_PAGE_SHOW_NEXT = 'display.page.show_next'
|
||||
DISPLAY_PAGE_SHOW_NEXT_ACTION_SCHEMA = maybe_simple_id({
|
||||
vol.Required(CONF_ID): cv.templatable(cv.use_variable_id(DisplayBuffer)),
|
||||
})
|
||||
|
||||
|
||||
@ACTION_REGISTRY.register(CONF_DISPLAY_PAGE_SHOW_NEXT, DISPLAY_PAGE_SHOW_NEXT_ACTION_SCHEMA)
|
||||
def display_page_show_next_to_code(config, action_id, template_arg, args):
|
||||
for var in get_variable(config[CONF_ID]):
|
||||
yield None
|
||||
type = DisplayPageShowNextAction.template(template_arg)
|
||||
yield Pvariable(action_id, type.new(var), type=type)
|
||||
|
||||
|
||||
CONF_DISPLAY_PAGE_SHOW_PREVIOUS = 'display.page.show_previous'
|
||||
DISPLAY_PAGE_SHOW_PREVIOUS_ACTION_SCHEMA = maybe_simple_id({
|
||||
vol.Required(CONF_ID): cv.templatable(cv.use_variable_id(DisplayBuffer)),
|
||||
})
|
||||
|
||||
|
||||
@ACTION_REGISTRY.register(CONF_DISPLAY_PAGE_SHOW_PREVIOUS, DISPLAY_PAGE_SHOW_PREVIOUS_ACTION_SCHEMA)
|
||||
def display_page_show_previous_to_code(config, action_id, template_arg, args):
|
||||
for var in get_variable(config[CONF_ID]):
|
||||
yield None
|
||||
type = DisplayPageShowPrevAction.template(template_arg)
|
||||
yield Pvariable(action_id, type.new(var), type=type)
|
||||
|
||||
|
||||
def setup_display(display_var, config):
|
||||
CORE.add_job(setup_display_core_, display_var, config)
|
||||
|
||||
|
||||
BUILD_FLAGS = '-DUSE_DISPLAY'
|
||||
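
The `for var in get_variable(...): yield` pattern in the actions above (and in many `to_code()` functions below) is a cooperative-scheduling idiom: the job yields while the referenced variable is not defined yet, and the inner generator yields the variable itself last, so `var` holds the result once the loop finishes. A minimal standalone sketch of that control flow, with a toy registry and scheduler standing in for the framework's job queue (all names here are illustrative):

```python
registry = {}  # toy stand-in for the generated-variable table

def get_variable(key):
    while key not in registry:
        yield None       # not defined yet; give other jobs a chance to run
    yield registry[key]  # defined now; yielded last so the caller's loop variable keeps it

def to_code():
    for var in get_variable('my_id'):
        yield            # suspend this job until 'my_id' exists
    print('got', var)    # -> got sensor*

job = to_code()
next(job)                      # first run: 'my_id' is missing, job suspends
registry['my_id'] = 'sensor*'  # another job defines the variable
for _ in job:                  # resume the job until it completes
    pass
```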
@@ -1,40 +0,0 @@
import voluptuous as vol

from esphome import config_validation as cv
from esphome.components import sensor
from esphome.const import CONF_ID, CONF_SCAN_INTERVAL, ESP_PLATFORM_ESP32
from esphome.core import HexInt
from esphome.cpp_generator import Pvariable, add
from esphome.cpp_helpers import setup_component
from esphome.cpp_types import App, Component, esphome_ns

ESP_PLATFORMS = [ESP_PLATFORM_ESP32]

CONF_ESP32_BLE_ID = 'esp32_ble_id'
ESP32BLETracker = esphome_ns.class_('ESP32BLETracker', Component)
XiaomiSensor = esphome_ns.class_('XiaomiSensor', sensor.Sensor)
XiaomiDevice = esphome_ns.class_('XiaomiDevice')
XIAOMI_SENSOR_SCHEMA = sensor.SENSOR_SCHEMA.extend({
    cv.GenerateID(): cv.declare_variable_id(XiaomiSensor)
})

CONFIG_SCHEMA = cv.Schema({
    cv.GenerateID(): cv.declare_variable_id(ESP32BLETracker),
    vol.Optional(CONF_SCAN_INTERVAL): cv.positive_time_period_seconds,
}).extend(cv.COMPONENT_SCHEMA.schema)


def make_address_array(address):
    return [HexInt(i) for i in address.parts]


def to_code(config):
    rhs = App.make_esp32_ble_tracker()
    ble = Pvariable(config[CONF_ID], rhs)
    if CONF_SCAN_INTERVAL in config:
        add(ble.set_scan_interval(config[CONF_SCAN_INTERVAL]))

    setup_component(ble, config)


BUILD_FLAGS = '-DUSE_ESP32_BLE_TRACKER'
@@ -1,130 +0,0 @@
import voluptuous as vol

from esphome import config_validation as cv, pins
from esphome.const import CONF_FREQUENCY, CONF_ID, CONF_NAME, CONF_PIN, CONF_SCL, CONF_SDA, \
    ESP_PLATFORM_ESP32
from esphome.cpp_generator import Pvariable, add
from esphome.cpp_types import App, Nameable, PollingComponent, esphome_ns

ESP_PLATFORMS = [ESP_PLATFORM_ESP32]

ESP32Camera = esphome_ns.class_('ESP32Camera', PollingComponent, Nameable)
ESP32CameraFrameSize = esphome_ns.enum('ESP32CameraFrameSize')
# NOTE: keys must be uppercase, since the resolution option is validated with upper=True.
FRAME_SIZES = {
    '160X120': ESP32CameraFrameSize.ESP32_CAMERA_SIZE_160X120,
    'QQVGA': ESP32CameraFrameSize.ESP32_CAMERA_SIZE_160X120,
    '128X160': ESP32CameraFrameSize.ESP32_CAMERA_SIZE_128X160,
    'QQVGA2': ESP32CameraFrameSize.ESP32_CAMERA_SIZE_128X160,
    '176X144': ESP32CameraFrameSize.ESP32_CAMERA_SIZE_176X144,
    'QCIF': ESP32CameraFrameSize.ESP32_CAMERA_SIZE_176X144,
    '240X176': ESP32CameraFrameSize.ESP32_CAMERA_SIZE_240X176,
    'HQVGA': ESP32CameraFrameSize.ESP32_CAMERA_SIZE_240X176,
    '320X240': ESP32CameraFrameSize.ESP32_CAMERA_SIZE_320X240,
    'QVGA': ESP32CameraFrameSize.ESP32_CAMERA_SIZE_320X240,
    '400X296': ESP32CameraFrameSize.ESP32_CAMERA_SIZE_400X296,
    'CIF': ESP32CameraFrameSize.ESP32_CAMERA_SIZE_400X296,
    '640X480': ESP32CameraFrameSize.ESP32_CAMERA_SIZE_640X480,
    'VGA': ESP32CameraFrameSize.ESP32_CAMERA_SIZE_640X480,
    '800X600': ESP32CameraFrameSize.ESP32_CAMERA_SIZE_800X600,
    'SVGA': ESP32CameraFrameSize.ESP32_CAMERA_SIZE_800X600,
    '1024X768': ESP32CameraFrameSize.ESP32_CAMERA_SIZE_1024X768,
    'XGA': ESP32CameraFrameSize.ESP32_CAMERA_SIZE_1024X768,
    '1280X1024': ESP32CameraFrameSize.ESP32_CAMERA_SIZE_1280X1024,
    'SXGA': ESP32CameraFrameSize.ESP32_CAMERA_SIZE_1280X1024,
    '1600X1200': ESP32CameraFrameSize.ESP32_CAMERA_SIZE_1600X1200,
    'UXGA': ESP32CameraFrameSize.ESP32_CAMERA_SIZE_1600X1200,
}

CONF_DATA_PINS = 'data_pins'
CONF_VSYNC_PIN = 'vsync_pin'
CONF_HREF_PIN = 'href_pin'
CONF_PIXEL_CLOCK_PIN = 'pixel_clock_pin'
CONF_EXTERNAL_CLOCK = 'external_clock'
CONF_I2C_PINS = 'i2c_pins'
CONF_RESET_PIN = 'reset_pin'
CONF_POWER_DOWN_PIN = 'power_down_pin'

CONF_MAX_FRAMERATE = 'max_framerate'
CONF_IDLE_FRAMERATE = 'idle_framerate'
CONF_RESOLUTION = 'resolution'
CONF_JPEG_QUALITY = 'jpeg_quality'
CONF_VERTICAL_FLIP = 'vertical_flip'
CONF_HORIZONTAL_MIRROR = 'horizontal_mirror'
CONF_CONTRAST = 'contrast'
CONF_BRIGHTNESS = 'brightness'
CONF_SATURATION = 'saturation'
CONF_TEST_PATTERN = 'test_pattern'

camera_range_param = vol.All(cv.int_, vol.Range(min=-2, max=2))

CONFIG_SCHEMA = cv.Schema({
    cv.GenerateID(): cv.declare_variable_id(ESP32Camera),
    vol.Required(CONF_NAME): cv.string,
    vol.Required(CONF_DATA_PINS): vol.All([pins.input_pin], vol.Length(min=8, max=8)),
    vol.Required(CONF_VSYNC_PIN): pins.input_pin,
    vol.Required(CONF_HREF_PIN): pins.input_pin,
    vol.Required(CONF_PIXEL_CLOCK_PIN): pins.input_pin,
    vol.Required(CONF_EXTERNAL_CLOCK): vol.Schema({
        vol.Required(CONF_PIN): pins.output_pin,
        vol.Optional(CONF_FREQUENCY, default='20MHz'): vol.All(cv.frequency, vol.In([20e6, 10e6])),
    }),
    vol.Required(CONF_I2C_PINS): vol.Schema({
        vol.Required(CONF_SDA): pins.output_pin,
        vol.Required(CONF_SCL): pins.output_pin,
    }),
    vol.Optional(CONF_RESET_PIN): pins.output_pin,
    vol.Optional(CONF_POWER_DOWN_PIN): pins.output_pin,

    vol.Optional(CONF_MAX_FRAMERATE, default='10 fps'): vol.All(cv.framerate,
                                                                vol.Range(min=0, min_included=False,
                                                                          max=60)),
    vol.Optional(CONF_IDLE_FRAMERATE, default='0.1 fps'): vol.All(cv.framerate,
                                                                  vol.Range(min=0, max=1)),
    vol.Optional(CONF_RESOLUTION, default='640X480'): cv.one_of(*FRAME_SIZES, upper=True),
    vol.Optional(CONF_JPEG_QUALITY, default=10): vol.All(cv.int_, vol.Range(min=10, max=63)),
    vol.Optional(CONF_CONTRAST, default=0): camera_range_param,
    vol.Optional(CONF_BRIGHTNESS, default=0): camera_range_param,
    vol.Optional(CONF_SATURATION, default=0): camera_range_param,
    vol.Optional(CONF_VERTICAL_FLIP, default=True): cv.boolean,
    vol.Optional(CONF_HORIZONTAL_MIRROR, default=True): cv.boolean,
    vol.Optional(CONF_TEST_PATTERN, default=False): cv.boolean,
}).extend(cv.COMPONENT_SCHEMA.schema)

SETTERS = {
    CONF_DATA_PINS: 'set_data_pins',
    CONF_VSYNC_PIN: 'set_vsync_pin',
    CONF_HREF_PIN: 'set_href_pin',
    CONF_PIXEL_CLOCK_PIN: 'set_pixel_clock_pin',
    CONF_RESET_PIN: 'set_reset_pin',
    CONF_POWER_DOWN_PIN: 'set_power_down_pin',
    CONF_JPEG_QUALITY: 'set_jpeg_quality',
    CONF_VERTICAL_FLIP: 'set_vertical_flip',
    CONF_HORIZONTAL_MIRROR: 'set_horizontal_mirror',
    CONF_CONTRAST: 'set_contrast',
    CONF_BRIGHTNESS: 'set_brightness',
    CONF_SATURATION: 'set_saturation',
    CONF_TEST_PATTERN: 'set_test_pattern',
}


def to_code(config):
    rhs = App.register_component(ESP32Camera.new(config[CONF_NAME]))
    cam = Pvariable(config[CONF_ID], rhs)

    for key, setter in SETTERS.items():
        if key in config:
            add(getattr(cam, setter)(config[key]))

    extclk = config[CONF_EXTERNAL_CLOCK]
    add(cam.set_external_clock(extclk[CONF_PIN], extclk[CONF_FREQUENCY]))
    i2c_pins = config[CONF_I2C_PINS]
    add(cam.set_i2c_pins(i2c_pins[CONF_SDA], i2c_pins[CONF_SCL]))
    add(cam.set_max_update_interval(1000 / config[CONF_MAX_FRAMERATE]))
    if config[CONF_IDLE_FRAMERATE] == 0:
        add(cam.set_idle_update_interval(0))
    else:
        add(cam.set_idle_update_interval(1000 / config[CONF_IDLE_FRAMERATE]))
    add(cam.set_frame_size(FRAME_SIZES[config[CONF_RESOLUTION]]))


BUILD_FLAGS = '-DUSE_ESP32_CAMERA'
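
As a quick illustration of the framerate-to-interval conversion in `to_code()` above, a standalone sketch (the values are made up):

```python
# The camera polls at 1000 / framerate milliseconds; an idle framerate of 0
# disables idle updates entirely, which is why it is special-cased above.
def update_interval_ms(framerate):
    return 0 if framerate == 0 else 1000 / framerate

print(update_interval_ms(10))   # 100.0 -> one frame every 100 ms
print(update_interval_ms(0.1))  # 10000.0 -> one idle frame every 10 s
```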
@@ -1,85 +0,0 @@
import voluptuous as vol

from esphome import pins
from esphome.components import wifi
import esphome.config_validation as cv
from esphome.const import CONF_DOMAIN, CONF_ID, CONF_MANUAL_IP, CONF_STATIC_IP, CONF_TYPE, \
    CONF_USE_ADDRESS, ESP_PLATFORM_ESP32
from esphome.core import CORE
from esphome.cpp_generator import Pvariable, add
from esphome.cpp_helpers import gpio_output_pin_expression
from esphome.cpp_types import App, Component, esphome_ns, global_ns

CONFLICTS_WITH = ['wifi']
ESP_PLATFORMS = [ESP_PLATFORM_ESP32]

CONF_PHY_ADDR = 'phy_addr'
CONF_MDC_PIN = 'mdc_pin'
CONF_MDIO_PIN = 'mdio_pin'
CONF_CLK_MODE = 'clk_mode'
CONF_POWER_PIN = 'power_pin'

EthernetType = esphome_ns.enum('EthernetType')
ETHERNET_TYPES = {
    'LAN8720': EthernetType.ETHERNET_TYPE_LAN8720,
    'TLK110': EthernetType.ETHERNET_TYPE_TLK110,
}

eth_clock_mode_t = global_ns.enum('eth_clock_mode_t')
CLK_MODES = {
    'GPIO0_IN': eth_clock_mode_t.ETH_CLOCK_GPIO0_IN,
    'GPIO0_OUT': eth_clock_mode_t.ETH_CLOCK_GPIO0_OUT,
    'GPIO16_OUT': eth_clock_mode_t.ETH_CLOCK_GPIO16_OUT,
    'GPIO17_OUT': eth_clock_mode_t.ETH_CLOCK_GPIO17_OUT,
}

EthernetComponent = esphome_ns.class_('EthernetComponent', Component)


def validate(config):
    if CONF_USE_ADDRESS not in config:
        if CONF_MANUAL_IP in config:
            use_address = str(config[CONF_MANUAL_IP][CONF_STATIC_IP])
        else:
            use_address = CORE.name + config[CONF_DOMAIN]
        config[CONF_USE_ADDRESS] = use_address
    return config


CONFIG_SCHEMA = vol.All(cv.Schema({
    cv.GenerateID(): cv.declare_variable_id(EthernetComponent),
    vol.Required(CONF_TYPE): cv.one_of(*ETHERNET_TYPES, upper=True),
    vol.Required(CONF_MDC_PIN): pins.output_pin,
    vol.Required(CONF_MDIO_PIN): pins.input_output_pin,
    vol.Optional(CONF_CLK_MODE, default='GPIO0_IN'): cv.one_of(*CLK_MODES, upper=True, space='_'),
    vol.Optional(CONF_PHY_ADDR, default=0): vol.All(cv.int_, vol.Range(min=0, max=31)),
    vol.Optional(CONF_POWER_PIN): pins.gpio_output_pin_schema,
    vol.Optional(CONF_MANUAL_IP): wifi.STA_MANUAL_IP_SCHEMA,
    vol.Optional(CONF_DOMAIN, default='.local'): cv.domain_name,
    vol.Optional(CONF_USE_ADDRESS): cv.string_strict,

    vol.Optional('hostname'): cv.invalid("The hostname option has been removed in 1.11.0"),
}), validate)


def to_code(config):
    rhs = App.init_ethernet()
    eth = Pvariable(config[CONF_ID], rhs)

    add(eth.set_phy_addr(config[CONF_PHY_ADDR]))
    add(eth.set_mdc_pin(config[CONF_MDC_PIN]))
    add(eth.set_mdio_pin(config[CONF_MDIO_PIN]))
    add(eth.set_type(ETHERNET_TYPES[config[CONF_TYPE]]))
    add(eth.set_clk_mode(CLK_MODES[config[CONF_CLK_MODE]]))
    add(eth.set_use_address(config[CONF_USE_ADDRESS]))

    if CONF_POWER_PIN in config:
        for pin in gpio_output_pin_expression(config[CONF_POWER_PIN]):
            yield
        add(eth.set_power_pin(pin))

    if CONF_MANUAL_IP in config:
        add(eth.set_manual_ip(wifi.manual_ip(config[CONF_MANUAL_IP])))


REQUIRED_BUILD_FLAGS = '-DUSE_ETHERNET'
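
A standalone sketch of the `use_address` fallback implemented by `validate()` above (names and addresses are illustrative):

```python
def pick_use_address(node_name, domain, static_ip=None):
    # Prefer the configured static IP; otherwise fall back to <name><domain>.
    return str(static_ip) if static_ip is not None else node_name + domain

print(pick_use_address('livingroom', '.local'))               # livingroom.local
print(pick_use_address('livingroom', '.local', '10.0.0.42'))  # 10.0.0.42
```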
@@ -1,119 +0,0 @@
# coding=utf-8
import voluptuous as vol

from esphome import core
from esphome.components import display
import esphome.config_validation as cv
from esphome.const import CONF_FILE, CONF_GLYPHS, CONF_ID, CONF_SIZE
from esphome.core import CORE, HexInt
from esphome.cpp_generator import Pvariable, progmem_array, safe_exp
from esphome.cpp_types import App, uint8
from esphome.py_compat import sort_by_cmp

DEPENDENCIES = ['display']
MULTI_CONF = True

Font = display.display_ns.class_('Font')
Glyph = display.display_ns.class_('Glyph')


def validate_glyphs(value):
    if isinstance(value, list):
        value = cv.Schema([cv.string])(value)
    value = cv.Schema([cv.string])(list(value))

    def comparator(x, y):
        x_ = x.encode('utf-8')
        y_ = y.encode('utf-8')

        for c in range(min(len(x_), len(y_))):
            if x_[c] < y_[c]:
                return -1
            if x_[c] > y_[c]:
                return 1

        if len(x_) < len(y_):
            return -1
        if len(x_) > len(y_):
            return 1
        raise vol.Invalid(u"Found duplicate glyph {}".format(x))

    sort_by_cmp(value, comparator)
    return value


def validate_pillow_installed(value):
    try:
        import PIL
    except ImportError:
        raise vol.Invalid("Please install the pillow python package to use this feature. "
                          "(pip install pillow)")

    if PIL.__version__[0] < '4':
        raise vol.Invalid("Please update your pillow installation to at least 4.0.x. "
                          "(pip install -U pillow)")

    return value


def validate_truetype_file(value):
    if value.endswith('.zip'):  # for Google Fonts downloads
        raise vol.Invalid(u"Please unzip the font archive '{}' first and then use the .ttf files "
                          u"inside.".format(value))
    if not value.endswith('.ttf'):
        raise vol.Invalid(u"Only truetype (.ttf) files are supported. Please make sure you're "
                          u"using the correct format or rename the extension to .ttf")
    return cv.file_(value)


DEFAULT_GLYPHS = u' !"%()+,-.:0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ_abcdefghijklmnopqrstuvwxyz°'
CONF_RAW_DATA_ID = 'raw_data_id'

FONT_SCHEMA = cv.Schema({
    vol.Required(CONF_ID): cv.declare_variable_id(Font),
    vol.Required(CONF_FILE): validate_truetype_file,
    vol.Optional(CONF_GLYPHS, default=DEFAULT_GLYPHS): validate_glyphs,
    vol.Optional(CONF_SIZE, default=20): vol.All(cv.int_, vol.Range(min=1)),
    cv.GenerateID(CONF_RAW_DATA_ID): cv.declare_variable_id(uint8),
})

CONFIG_SCHEMA = vol.All(validate_pillow_installed, FONT_SCHEMA)


def to_code(config):
    from PIL import ImageFont

    path = CORE.relative_path(config[CONF_FILE])
    try:
        font = ImageFont.truetype(path, config[CONF_SIZE])
    except Exception as e:
        raise core.EsphomeError(u"Could not load truetype file {}: {}".format(path, e))

    ascent, descent = font.getmetrics()

    glyph_args = {}
    data = []
    for glyph in config[CONF_GLYPHS]:
        mask = font.getmask(glyph, mode='1')
        _, (offset_x, offset_y) = font.font.getsize(glyph)
        width, height = mask.size
        width8 = ((width + 7) // 8) * 8
        glyph_data = [0 for _ in range(height * width8 // 8)]  # noqa: F812
        for y in range(height):
            for x in range(width):
                if not mask.getpixel((x, y)):
                    continue
                pos = x + y * width8
                glyph_data[pos // 8] |= 0x80 >> (pos % 8)
        glyph_args[glyph] = (len(data), offset_x, offset_y, width, height)
        data += glyph_data

    rhs = safe_exp([HexInt(x) for x in data])
    prog_arr = progmem_array(config[CONF_RAW_DATA_ID], rhs)

    glyphs = []
    for glyph in config[CONF_GLYPHS]:
        glyphs.append(Glyph(glyph, prog_arr, *glyph_args[glyph]))

    rhs = App.make_font(glyphs, ascent, ascent + descent)
    Pvariable(config[CONF_ID], rhs)
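
The packing loop in `to_code()` above stores each glyph as a 1-bit bitmap whose rows are padded to a multiple of 8 pixels, most significant bit first. A standalone sketch with a hypothetical 11x2-pixel glyph:

```python
width, height = 11, 2
width8 = ((width + 7) // 8) * 8          # 16: row stride, padded to whole bytes
data = [0] * (height * width8 // 8)      # 4 bytes for the whole glyph
for x, y in [(0, 0), (10, 1)]:           # two example "on" pixels
    pos = x + y * width8                 # bit index within the bitmap
    data[pos // 8] |= 0x80 >> (pos % 8)  # set that bit, MSB first
print([hex(b) for b in data])            # ['0x80', '0x0', '0x0', '0x20']
```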
@@ -1,35 +0,0 @@
import voluptuous as vol

from esphome import config_validation as cv
from esphome.const import CONF_ID, CONF_INITIAL_VALUE, CONF_RESTORE_VALUE, CONF_TYPE
from esphome.cpp_generator import Pvariable, RawExpression, TemplateArguments, add
from esphome.cpp_helpers import setup_component
from esphome.cpp_types import App, Component, esphome_ns

GlobalVariableComponent = esphome_ns.class_('GlobalVariableComponent', Component)

MULTI_CONF = True

CONFIG_SCHEMA = cv.Schema({
    vol.Required(CONF_ID): cv.declare_variable_id(GlobalVariableComponent),
    vol.Required(CONF_TYPE): cv.string_strict,
    vol.Optional(CONF_INITIAL_VALUE): cv.string_strict,
    vol.Optional(CONF_RESTORE_VALUE): cv.boolean,
}).extend(cv.COMPONENT_SCHEMA.schema)


def to_code(config):
    type_ = RawExpression(config[CONF_TYPE])
    template_args = TemplateArguments(type_)
    res_type = GlobalVariableComponent.template(template_args)
    initial_value = None
    if CONF_INITIAL_VALUE in config:
        initial_value = RawExpression(config[CONF_INITIAL_VALUE])
    rhs = App.Pmake_global_variable(template_args, initial_value)
    glob = Pvariable(config[CONF_ID], rhs, type=res_type)

    if config.get(CONF_RESTORE_VALUE, False):
        hash_ = hash(config[CONF_ID].id) % 2**32
        add(glob.set_restore_value(hash_))

    setup_component(glob, config)
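
The restore key above is just Python's `hash()` of the variable's ID, truncated to 32 bits so it fits an unsigned 32-bit key on the C++ side. A standalone sketch (note that Python 3 salts `hash()` for strings per process, so the key is only stable within one interpreter run unless PYTHONHASHSEED is pinned):

```python
var_id = 'my_counter'               # hypothetical global variable ID
restore_key = hash(var_id) % 2**32  # always a value in [0, 2**32)
print(restore_key)
```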
@@ -1,63 +0,0 @@
# coding=utf-8
import logging

import voluptuous as vol

from esphome import core
from esphome.components import display, font
import esphome.config_validation as cv
from esphome.const import CONF_FILE, CONF_ID, CONF_RESIZE
from esphome.core import CORE, HexInt
from esphome.cpp_generator import Pvariable, progmem_array, safe_exp
from esphome.cpp_types import App, uint8

_LOGGER = logging.getLogger(__name__)

DEPENDENCIES = ['display']
MULTI_CONF = True

Image_ = display.display_ns.class_('Image')

CONF_RAW_DATA_ID = 'raw_data_id'

IMAGE_SCHEMA = cv.Schema({
    vol.Required(CONF_ID): cv.declare_variable_id(Image_),
    vol.Required(CONF_FILE): cv.file_,
    vol.Optional(CONF_RESIZE): cv.dimensions,
    cv.GenerateID(CONF_RAW_DATA_ID): cv.declare_variable_id(uint8),
})

CONFIG_SCHEMA = vol.All(font.validate_pillow_installed, IMAGE_SCHEMA)


def to_code(config):
    from PIL import Image

    path = CORE.relative_path(config[CONF_FILE])
    try:
        image = Image.open(path)
    except Exception as e:
        raise core.EsphomeError(u"Could not load image file {}: {}".format(path, e))

    if CONF_RESIZE in config:
        image.thumbnail(config[CONF_RESIZE])

    image = image.convert('1', dither=Image.NONE)
    width, height = image.size
    if width > 500 or height > 500:
        _LOGGER.warning("The image you requested is very big. Please consider using the resize "
                        "parameter")
    width8 = ((width + 7) // 8) * 8
    data = [0 for _ in range(height * width8 // 8)]
    for y in range(height):
        for x in range(width):
            if image.getpixel((x, y)):
                continue
            pos = x + y * width8
            data[pos // 8] |= 0x80 >> (pos % 8)

    rhs = safe_exp([HexInt(x) for x in data])
    prog_arr = progmem_array(config[CONF_RAW_DATA_ID], rhs)

    rhs = App.make_image(prog_arr, width, height)
    Pvariable(config[CONF_ID], rhs)
@@ -1,24 +0,0 @@
import voluptuous as vol

from esphome import automation
import esphome.config_validation as cv
from esphome.const import CONF_ID, CONF_INTERVAL
from esphome.cpp_generator import Pvariable
from esphome.cpp_helpers import setup_component
from esphome.cpp_types import App, PollingComponent, Trigger, esphome_ns

IntervalTrigger = esphome_ns.class_('IntervalTrigger', Trigger.template(), PollingComponent)

CONFIG_SCHEMA = automation.validate_automation(cv.Schema({
    cv.GenerateID(): cv.declare_variable_id(IntervalTrigger),
    vol.Required(CONF_INTERVAL): cv.positive_time_period_milliseconds,
}).extend(cv.COMPONENT_SCHEMA.schema))


def to_code(config):
    for conf in config:
        rhs = App.register_component(IntervalTrigger.new(conf[CONF_INTERVAL]))
        trigger = Pvariable(conf[CONF_ID], rhs)
        setup_component(trigger, conf)

        automation.build_automations(trigger, [], conf)
@@ -1,195 +0,0 @@
import voluptuous as vol

from esphome import pins
from esphome.components import light
from esphome.components.light import AddressableLight
from esphome.components.power_supply import PowerSupplyComponent
import esphome.config_validation as cv
from esphome.const import CONF_CLOCK_PIN, CONF_COLOR_CORRECT, CONF_DATA_PIN, \
    CONF_DEFAULT_TRANSITION_LENGTH, CONF_EFFECTS, CONF_GAMMA_CORRECT, CONF_MAKE_ID, CONF_METHOD, \
    CONF_NAME, CONF_NUM_LEDS, CONF_PIN, CONF_POWER_SUPPLY, CONF_TYPE, CONF_VARIANT
from esphome.core import CORE
from esphome.cpp_generator import TemplateArguments, add, get_variable, variable
from esphome.cpp_helpers import setup_component
from esphome.cpp_types import App, Application, Component, global_ns

NeoPixelBusLightOutputBase = light.light_ns.class_('NeoPixelBusLightOutputBase', Component,
                                                   AddressableLight)
ESPNeoPixelOrder = light.light_ns.namespace('ESPNeoPixelOrder')


def validate_type(value):
    value = cv.string(value).upper()
    if 'R' not in value:
        raise vol.Invalid("Must have R in type")
    if 'G' not in value:
        raise vol.Invalid("Must have G in type")
    if 'B' not in value:
        raise vol.Invalid("Must have B in type")
    rest = set(value) - set('RGBW')
    if rest:
        raise vol.Invalid("Type has invalid color: {}".format(', '.join(rest)))
    if len(set(value)) != len(value):
        raise vol.Invalid("Type has duplicate color!")
    return value


def validate_variant(value):
    value = cv.string(value).upper()
    if value == 'WS2813':
        value = 'WS2812X'
    if value == 'WS2812':
        value = '800KBPS'
    if value == 'LC8812':
        value = 'SK6812'
    return cv.one_of(*VARIANTS)(value)


def validate_method(value):
    if value is None:
        if CORE.is_esp32:
            return 'ESP32_I2S_1'
        if CORE.is_esp8266:
            return 'ESP8266_DMA'
        raise NotImplementedError

    if CORE.is_esp32:
        return cv.one_of(*ESP32_METHODS, upper=True, space='_')(value)
    if CORE.is_esp8266:
        return cv.one_of(*ESP8266_METHODS, upper=True, space='_')(value)
    raise NotImplementedError


def validate_method_pin(value):
    method = value[CONF_METHOD]
    method_pins = {
        'ESP8266_DMA': [3],
        'ESP8266_UART0': [1],
        'ESP8266_ASYNC_UART0': [1],
        'ESP8266_UART1': [2],
        'ESP8266_ASYNC_UART1': [2],
        'ESP32_I2S_0': list(range(0, 32)),
        'ESP32_I2S_1': list(range(0, 32)),
    }
    if CORE.is_esp8266:
        method_pins['BIT_BANG'] = list(range(0, 16))
    elif CORE.is_esp32:
        method_pins['BIT_BANG'] = list(range(0, 32))
    pins_ = method_pins[method]
    for opt in (CONF_PIN, CONF_CLOCK_PIN, CONF_DATA_PIN):
        if opt in value and value[opt] not in pins_:
            raise vol.Invalid("Method {} only supports pin(s) {}".format(
                method, ', '.join('GPIO{}'.format(x) for x in pins_)
            ), path=[CONF_METHOD])
    return value


VARIANTS = {
    'WS2812X': 'Ws2812x',
    'SK6812': 'Sk6812',
    '800KBPS': '800Kbps',
    '400KBPS': '400Kbps',
}

ESP8266_METHODS = {
    'ESP8266_DMA': 'NeoEsp8266Dma{}Method',
    'ESP8266_UART0': 'NeoEsp8266Uart0{}Method',
    'ESP8266_UART1': 'NeoEsp8266Uart1{}Method',
    'ESP8266_ASYNC_UART0': 'NeoEsp8266AsyncUart0{}Method',
    'ESP8266_ASYNC_UART1': 'NeoEsp8266AsyncUart1{}Method',
    'BIT_BANG': 'NeoEsp8266BitBang{}Method',
}
ESP32_METHODS = {
    'ESP32_I2S_0': 'NeoEsp32I2s0{}Method',
    'ESP32_I2S_1': 'NeoEsp32I2s1{}Method',
    'BIT_BANG': 'NeoEsp32BitBang{}Method',
}


def format_method(config):
    variant = VARIANTS[config[CONF_VARIANT]]
    method = config[CONF_METHOD]
    if CORE.is_esp8266:
        return ESP8266_METHODS[method].format(variant)
    if CORE.is_esp32:
        return ESP32_METHODS[method].format(variant)
    raise NotImplementedError


def validate(config):
    if CONF_PIN in config:
        if CONF_CLOCK_PIN in config or CONF_DATA_PIN in config:
            raise vol.Invalid("Cannot specify both 'pin' and 'clock_pin'+'data_pin'")
        return config
    if CONF_CLOCK_PIN in config:
        if CONF_DATA_PIN not in config:
            raise vol.Invalid("If you give clock_pin, you must also specify data_pin")
        return config
    raise vol.Invalid("Must specify at least one of 'pin' or 'clock_pin'+'data_pin'")


MakeNeoPixelBusLight = Application.struct('MakeNeoPixelBusLight')

PLATFORM_SCHEMA = cv.nameable(light.LIGHT_PLATFORM_SCHEMA.extend({
    cv.GenerateID(): cv.declare_variable_id(light.AddressableLightState),
    cv.GenerateID(CONF_MAKE_ID): cv.declare_variable_id(MakeNeoPixelBusLight),

    vol.Optional(CONF_TYPE, default='GRB'): validate_type,
    vol.Optional(CONF_VARIANT, default='800KBPS'): validate_variant,
    vol.Optional(CONF_METHOD, default=None): validate_method,
    vol.Optional(CONF_PIN): pins.output_pin,
    vol.Optional(CONF_CLOCK_PIN): pins.output_pin,
    vol.Optional(CONF_DATA_PIN): pins.output_pin,

    vol.Required(CONF_NUM_LEDS): cv.positive_not_null_int,

    vol.Optional(CONF_GAMMA_CORRECT): cv.positive_float,
    vol.Optional(CONF_COLOR_CORRECT): vol.All([cv.percentage], vol.Length(min=3, max=4)),
    vol.Optional(CONF_DEFAULT_TRANSITION_LENGTH): cv.positive_time_period_milliseconds,
    vol.Optional(CONF_POWER_SUPPLY): cv.use_variable_id(PowerSupplyComponent),
    vol.Optional(CONF_EFFECTS): light.validate_effects(light.ADDRESSABLE_EFFECTS),
}).extend(cv.COMPONENT_SCHEMA.schema), validate, validate_method_pin)


def to_code(config):
    type_ = config[CONF_TYPE]
    has_white = 'W' in type_
    if has_white:
        func = App.make_neo_pixel_bus_rgbw_light
        color_feat = global_ns.NeoRgbwFeature
    else:
        func = App.make_neo_pixel_bus_rgb_light
        color_feat = global_ns.NeoRgbFeature

    template = TemplateArguments(getattr(global_ns, format_method(config)), color_feat)
    rhs = func(template, config[CONF_NAME])
    make = variable(config[CONF_MAKE_ID], rhs, type=MakeNeoPixelBusLight.template(template))
    output = make.Poutput

    if CONF_PIN in config:
        add(output.add_leds(config[CONF_NUM_LEDS], config[CONF_PIN]))
    else:
        add(output.add_leds(config[CONF_NUM_LEDS], config[CONF_CLOCK_PIN], config[CONF_DATA_PIN]))

    add(output.set_pixel_order(getattr(ESPNeoPixelOrder, type_)))

    if CONF_POWER_SUPPLY in config:
        for power_supply in get_variable(config[CONF_POWER_SUPPLY]):
            yield
        add(output.set_power_supply(power_supply))

    if CONF_COLOR_CORRECT in config:
        add(output.set_correction(*config[CONF_COLOR_CORRECT]))

    light.setup_light(make.Pstate, config)
    setup_component(output, config)


REQUIRED_BUILD_FLAGS = '-DUSE_NEO_PIXEL_BUS_LIGHT'

LIB_DEPS = 'NeoPixelBus@2.4.1'


def to_hass_config(data, config):
    return light.core_to_hass_config(data, config, brightness=True, rgb=True, color_temp=False,
                                     white_value='W' in config[CONF_TYPE])
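
`format_method()` above simply combines the method and variant lookup tables into a NeoPixelBus C++ template type name. A standalone sketch using a subset of the tables' own entries:

```python
ESP8266_METHODS = {'ESP8266_DMA': 'NeoEsp8266Dma{}Method'}  # subset of the table above
VARIANTS = {'800KBPS': '800Kbps'}

print(ESP8266_METHODS['ESP8266_DMA'].format(VARIANTS['800KBPS']))
# -> NeoEsp8266Dma800KbpsMethod
```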
@@ -1,52 +0,0 @@
import voluptuous as vol

from esphome.components import light
from esphome.components.light import AddressableLight
import esphome.config_validation as cv
from esphome.const import CONF_DEFAULT_TRANSITION_LENGTH, CONF_EFFECTS, CONF_FROM, CONF_ID, \
    CONF_MAKE_ID, CONF_NAME, CONF_SEGMENTS, CONF_TO
from esphome.cpp_generator import get_variable, variable
from esphome.cpp_types import App, Application

AddressableSegment = light.light_ns.class_('AddressableSegment')
PartitionLightOutput = light.light_ns.class_('PartitionLightOutput', AddressableLight)
MakePartitionLight = Application.struct('MakePartitionLight')


def validate_from_to(value):
    if value[CONF_FROM] > value[CONF_TO]:
        raise vol.Invalid(u"From ({}) must not be larger than to ({})"
                          u"".format(value[CONF_FROM], value[CONF_TO]))
    return value


PLATFORM_SCHEMA = cv.nameable(light.LIGHT_PLATFORM_SCHEMA.extend({
    cv.GenerateID(): cv.declare_variable_id(light.AddressableLightState),
    cv.GenerateID(CONF_MAKE_ID): cv.declare_variable_id(MakePartitionLight),

    vol.Required(CONF_SEGMENTS): vol.All(cv.ensure_list({
        vol.Required(CONF_ID): cv.use_variable_id(light.AddressableLightState),
        vol.Required(CONF_FROM): cv.positive_int,
        vol.Required(CONF_TO): cv.positive_int,
    }, validate_from_to), vol.Length(min=1)),

    vol.Optional(CONF_DEFAULT_TRANSITION_LENGTH): cv.positive_time_period_milliseconds,
    vol.Optional(CONF_EFFECTS): light.validate_effects(light.ADDRESSABLE_EFFECTS),
}))


def to_code(config):
    segments = []
    for conf in config[CONF_SEGMENTS]:
        for var in get_variable(conf[CONF_ID]):
            yield
        segments.append(AddressableSegment(var, conf[CONF_FROM],
                                           conf[CONF_TO] - conf[CONF_FROM] + 1))

    rhs = App.make_partition_light(config[CONF_NAME], segments)
    make = variable(config[CONF_MAKE_ID], rhs)
    light.setup_light(make.Pstate, config)


def to_hass_config(data, config):
    return light.core_to_hass_config(data, config, brightness=True, rgb=True, color_temp=False)
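
Note that `from`/`to` above are inclusive LED indices, which is why the segment length passed to `AddressableSegment` is `to - from + 1`:

```python
seg_from, seg_to = 0, 9       # hypothetical segment bounds from the config
print(seg_to - seg_from + 1)  # 10 LEDs in the segment
```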
@@ -1,184 +0,0 @@
import re

import voluptuous as vol

from esphome.automation import ACTION_REGISTRY, LambdaAction
import esphome.config_validation as cv
from esphome.const import CONF_ARGS, CONF_BAUD_RATE, CONF_FORMAT, CONF_HARDWARE_UART, CONF_ID, \
    CONF_LEVEL, CONF_LOGS, CONF_TAG, CONF_TX_BUFFER_SIZE
from esphome.core import CORE, EsphomeError, Lambda
from esphome.cpp_generator import Pvariable, RawExpression, add, process_lambda, statement
from esphome.cpp_types import App, Component, esphome_ns, global_ns, void
from esphome.py_compat import text_type

LOG_LEVELS = {
    'NONE': global_ns.ESPHOME_LOG_LEVEL_NONE,
    'ERROR': global_ns.ESPHOME_LOG_LEVEL_ERROR,
    'WARN': global_ns.ESPHOME_LOG_LEVEL_WARN,
    'INFO': global_ns.ESPHOME_LOG_LEVEL_INFO,
    'DEBUG': global_ns.ESPHOME_LOG_LEVEL_DEBUG,
    'VERBOSE': global_ns.ESPHOME_LOG_LEVEL_VERBOSE,
    'VERY_VERBOSE': global_ns.ESPHOME_LOG_LEVEL_VERY_VERBOSE,
}

LOG_LEVEL_TO_ESP_LOG = {
    'ERROR': global_ns.ESP_LOGE,
    'WARN': global_ns.ESP_LOGW,
    'INFO': global_ns.ESP_LOGI,
    'DEBUG': global_ns.ESP_LOGD,
    'VERBOSE': global_ns.ESP_LOGV,
    'VERY_VERBOSE': global_ns.ESP_LOGVV,
}

LOG_LEVEL_SEVERITY = ['NONE', 'ERROR', 'WARN', 'INFO', 'DEBUG', 'VERBOSE', 'VERY_VERBOSE']

UART_SELECTION_ESP32 = ['UART0', 'UART1', 'UART2']

UART_SELECTION_ESP8266 = ['UART0', 'UART0_SWAP', 'UART1']

HARDWARE_UART_TO_UART_SELECTION = {
    'UART0': global_ns.UART_SELECTION_UART0,
    'UART0_SWAP': global_ns.UART_SELECTION_UART0_SWAP,
    'UART1': global_ns.UART_SELECTION_UART1,
    'UART2': global_ns.UART_SELECTION_UART2,
}

HARDWARE_UART_TO_SERIAL = {
    'UART0': 'Serial',
    'UART0_SWAP': 'Serial',
    'UART1': 'Serial1',
    'UART2': 'Serial2',
}

# pylint: disable=invalid-name
is_log_level = cv.one_of(*LOG_LEVELS, upper=True)


def uart_selection(value):
    if CORE.is_esp32:
        return cv.one_of(*UART_SELECTION_ESP32, upper=True)(value)
    if CORE.is_esp8266:
        return cv.one_of(*UART_SELECTION_ESP8266, upper=True)(value)
    raise NotImplementedError


def validate_local_no_higher_than_global(value):
    global_level = value.get(CONF_LEVEL, 'DEBUG')
    for tag, level in value.get(CONF_LOGS, {}).items():
        if LOG_LEVEL_SEVERITY.index(level) > LOG_LEVEL_SEVERITY.index(global_level):
            raise EsphomeError(u"The local log level {} for {} must be less severe than the "
                               u"global log level {}.".format(level, tag, global_level))
    return value


LogComponent = esphome_ns.class_('LogComponent', Component)

CONFIG_SCHEMA = vol.All(cv.Schema({
    cv.GenerateID(): cv.declare_variable_id(LogComponent),
    vol.Optional(CONF_BAUD_RATE, default=115200): cv.positive_int,
    vol.Optional(CONF_TX_BUFFER_SIZE, default=512): cv.validate_bytes,
    vol.Optional(CONF_HARDWARE_UART, default='UART0'): uart_selection,
    vol.Optional(CONF_LEVEL): is_log_level,
    vol.Optional(CONF_LOGS): cv.Schema({
        cv.string: is_log_level,
    })
}), validate_local_no_higher_than_global)


def to_code(config):
    rhs = App.init_log(config.get(CONF_BAUD_RATE),
                       config.get(CONF_TX_BUFFER_SIZE),
                       HARDWARE_UART_TO_UART_SELECTION[config.get(CONF_HARDWARE_UART)])
    log = Pvariable(config[CONF_ID], rhs)
    if CONF_LEVEL in config:
        add(log.set_global_log_level(LOG_LEVELS[config[CONF_LEVEL]]))
    for tag, level in config.get(CONF_LOGS, {}).items():
        add(log.set_log_level(tag, LOG_LEVELS[level]))


def required_build_flags(config):
    flags = []
    if CONF_LEVEL in config:
        flags.append(u'-DESPHOME_LOG_LEVEL={}'.format(str(LOG_LEVELS[config[CONF_LEVEL]])))
        this_severity = LOG_LEVEL_SEVERITY.index(config[CONF_LEVEL])
        verbose_severity = LOG_LEVEL_SEVERITY.index('VERBOSE')
        very_verbose_severity = LOG_LEVEL_SEVERITY.index('VERY_VERBOSE')
        is_at_least_verbose = this_severity >= verbose_severity
        is_at_least_very_verbose = this_severity >= very_verbose_severity
        has_serial_logging = config.get(CONF_BAUD_RATE) != 0
        if CORE.is_esp8266 and has_serial_logging and is_at_least_verbose:
            debug_serial_port = HARDWARE_UART_TO_SERIAL[config.get(CONF_HARDWARE_UART)]
            flags.append(u"-DDEBUG_ESP_PORT={}".format(debug_serial_port))
            flags.append(u"-DLWIP_DEBUG")
            DEBUG_COMPONENTS = {
                'HTTP_CLIENT',
                'HTTP_SERVER',
                'HTTP_UPDATE',
                'OTA',
                'SSL',
                'TLS_MEM',
                'UPDATER',
                'WIFI',
            }
            for comp in DEBUG_COMPONENTS:
                flags.append(u"-DDEBUG_ESP_{}".format(comp))
        if CORE.is_esp32 and is_at_least_verbose:
            flags.append('-DCORE_DEBUG_LEVEL=5')
        if CORE.is_esp32 and is_at_least_very_verbose:
            flags.append('-DENABLE_I2C_DEBUG_BUFFER')

    return flags


def maybe_simple_message(schema):
    def validator(value):
        if isinstance(value, dict):
            return cv.Schema(schema)(value)
        return cv.Schema(schema)({CONF_FORMAT: value})

    return validator


def validate_printf(value):
    # https://stackoverflow.com/questions/30011379/how-can-i-parse-a-c-format-string-in-python
    # pylint: disable=anomalous-backslash-in-string
    cfmt = u"""\
    (                                  # start of capture group 1
    %                                  # literal "%"
    (?:                                # first option
    (?:[-+0 #]{0,5})                   # optional flags
    (?:\d+|\*)?                        # width
    (?:\.(?:\d+|\*))?                  # precision
    (?:h|l|ll|w|I|I32|I64)?            # size
    [cCdiouxXeEfgGaAnpsSZ]             # type
    ) |                                # OR
    %%)                                # literal "%%"
    """  # noqa
    matches = re.findall(cfmt, value[CONF_FORMAT], flags=re.X)
    if len(matches) != len(value[CONF_ARGS]):
        raise vol.Invalid(u"Found {} printf-patterns ({}), but {} args were given!"
                          u"".format(len(matches), u', '.join(matches), len(value[CONF_ARGS])))
    return value


CONF_LOGGER_LOG = 'logger.log'
LOGGER_LOG_ACTION_SCHEMA = vol.All(maybe_simple_message({
    vol.Required(CONF_FORMAT): cv.string,
    vol.Optional(CONF_ARGS, default=list): cv.ensure_list(cv.lambda_),
    vol.Optional(CONF_LEVEL, default="DEBUG"): cv.one_of(*LOG_LEVEL_TO_ESP_LOG, upper=True),
    vol.Optional(CONF_TAG, default="main"): cv.string,
}), validate_printf)


@ACTION_REGISTRY.register(CONF_LOGGER_LOG, LOGGER_LOG_ACTION_SCHEMA)
def logger_log_action_to_code(config, action_id, template_arg, args):
    esp_log = LOG_LEVEL_TO_ESP_LOG[config[CONF_LEVEL]]
    args_ = [RawExpression(text_type(x)) for x in config[CONF_ARGS]]

    text = text_type(statement(esp_log(config[CONF_TAG], config[CONF_FORMAT], *args_)))

    for lambda_ in process_lambda(Lambda(text), args, return_type=void):
        yield None
    rhs = LambdaAction.new(template_arg, lambda_)
    type = LambdaAction.template(template_arg)
    yield Pvariable(action_id, rhs, type=type)
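
`validate_printf()` above counts printf-style placeholders in the format string and requires exactly one argument per placeholder. A standalone sketch of the same counting idea with a compact version of the pattern (the format string is made up):

```python
import re

# One alternation per placeholder: flags, width, precision, size, type -- or a literal %%.
CFMT = r"%(?:[-+0 #]{0,5}(?:\d+|\*)?(?:\.(?:\d+|\*))?(?:h|l|ll|w|I|I32|I64)?[cCdiouxXeEfgGaAnpsSZ]|%)"

fmt = "Temperature is %.1f°C (raw: %d)"
print(len(re.findall(CFMT, fmt)))  # 2 -> exactly two args must be supplied
```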
@@ -1,35 +0,0 @@
import voluptuous as vol

from esphome import pins
import esphome.config_validation as cv
from esphome.const import CONF_ADDRESS, CONF_ID
from esphome.cpp_generator import Pvariable
from esphome.cpp_helpers import setup_component
from esphome.cpp_types import App, GPIOInputPin, GPIOOutputPin, io_ns, esphome_ns

DEPENDENCIES = ['i2c']
MULTI_CONF = True

MCP23017GPIOMode = esphome_ns.enum('MCP23017GPIOMode')
MCP23017_GPIO_MODES = {
    'INPUT': MCP23017GPIOMode.MCP23017_INPUT,
    'INPUT_PULLUP': MCP23017GPIOMode.MCP23017_INPUT_PULLUP,
    'OUTPUT': MCP23017GPIOMode.MCP23017_OUTPUT,
}

MCP23017GPIOInputPin = io_ns.class_('MCP23017GPIOInputPin', GPIOInputPin)
MCP23017GPIOOutputPin = io_ns.class_('MCP23017GPIOOutputPin', GPIOOutputPin)

CONFIG_SCHEMA = cv.Schema({
    vol.Required(CONF_ID): cv.declare_variable_id(pins.MCP23017),
    vol.Optional(CONF_ADDRESS, default=0x20): cv.i2c_address,
}).extend(cv.COMPONENT_SCHEMA.schema)


def to_code(config):
    rhs = App.make_mcp23017_component(config[CONF_ADDRESS])
    var = Pvariable(config[CONF_ID], rhs)
    setup_component(var, config)


BUILD_FLAGS = '-DUSE_MCP23017'
@@ -1,29 +0,0 @@
import voluptuous as vol

from esphome.components import i2c, binary_sensor
import esphome.config_validation as cv
from esphome.const import CONF_ADDRESS, CONF_ID
from esphome.cpp_generator import Pvariable
from esphome.cpp_helpers import setup_component
from esphome.cpp_types import App, Component

DEPENDENCIES = ['i2c']
MULTI_CONF = True

CONF_MPR121_ID = 'mpr121_id'
MPR121Component = binary_sensor.binary_sensor_ns.class_('MPR121Component', Component, i2c.I2CDevice)

CONFIG_SCHEMA = cv.Schema({
    cv.GenerateID(): cv.declare_variable_id(MPR121Component),
    vol.Optional(CONF_ADDRESS): cv.i2c_address
}).extend(cv.COMPONENT_SCHEMA.schema)


def to_code(config):
    rhs = App.make_mpr121(config.get(CONF_ADDRESS))
    var = Pvariable(config[CONF_ID], rhs)

    setup_component(var, config)


BUILD_FLAGS = '-DUSE_MPR121'
@@ -1,46 +0,0 @@
import voluptuous as vol

from esphome import pins
from esphome.components import output
import esphome.config_validation as cv
from esphome.const import (CONF_BIT_DEPTH, CONF_CLOCK_PIN, CONF_DATA_PIN, CONF_ID,
                           CONF_NUM_CHANNELS, CONF_NUM_CHIPS, CONF_UPDATE_ON_BOOT)
from esphome.cpp_generator import Pvariable, add
from esphome.cpp_helpers import gpio_output_pin_expression, setup_component
from esphome.cpp_types import App, Component

MY9231OutputComponent = output.output_ns.class_('MY9231OutputComponent', Component)
MULTI_CONF = True

CONFIG_SCHEMA = cv.Schema({
    cv.GenerateID(): cv.declare_variable_id(MY9231OutputComponent),
    vol.Required(CONF_DATA_PIN): pins.gpio_output_pin_schema,
    vol.Required(CONF_CLOCK_PIN): pins.gpio_output_pin_schema,
    vol.Optional(CONF_NUM_CHANNELS): vol.All(vol.Coerce(int),
                                             vol.Range(3, 1020)),
    vol.Optional(CONF_NUM_CHIPS): vol.All(vol.Coerce(int),
                                          vol.Range(1, 255)),
    vol.Optional(CONF_BIT_DEPTH): cv.one_of(8, 12, 14, 16, int=True),
    vol.Optional(CONF_UPDATE_ON_BOOT): vol.Coerce(bool),
}).extend(cv.COMPONENT_SCHEMA.schema)


def to_code(config):
    for di in gpio_output_pin_expression(config[CONF_DATA_PIN]):
        yield
    for dcki in gpio_output_pin_expression(config[CONF_CLOCK_PIN]):
        yield
    rhs = App.make_my9231_component(di, dcki)
    my9231 = Pvariable(config[CONF_ID], rhs)
    if CONF_NUM_CHANNELS in config:
        add(my9231.set_num_channels(config[CONF_NUM_CHANNELS]))
    if CONF_NUM_CHIPS in config:
        add(my9231.set_num_chips(config[CONF_NUM_CHIPS]))
    if CONF_BIT_DEPTH in config:
        add(my9231.set_bit_depth(config[CONF_BIT_DEPTH]))
    if CONF_UPDATE_ON_BOOT in config:
        add(my9231.set_update(config[CONF_UPDATE_ON_BOOT]))
    setup_component(my9231, config)


BUILD_FLAGS = '-DUSE_MY9231_OUTPUT'
@@ -1,58 +0,0 @@
import voluptuous as vol

from esphome.components import output
import esphome.config_validation as cv
from esphome.const import CONF_ID, CONF_OUTPUTS, CONF_TYPE
from esphome.cpp_generator import Pvariable, get_variable
from esphome.cpp_helpers import setup_component

BinaryCopyOutput = output.output_ns.class_('BinaryCopyOutput', output.BinaryOutput)
FloatCopyOutput = output.output_ns.class_('FloatCopyOutput', output.FloatOutput)

BINARY_SCHEMA = output.PLATFORM_SCHEMA.extend({
    vol.Required(CONF_ID): cv.declare_variable_id(BinaryCopyOutput),
    vol.Required(CONF_TYPE): 'binary',
    vol.Required(CONF_OUTPUTS): cv.ensure_list(cv.use_variable_id(output.BinaryOutput)),
})

FLOAT_SCHEMA = output.PLATFORM_SCHEMA.extend({
    vol.Required(CONF_ID): cv.declare_variable_id(FloatCopyOutput),
    vol.Required(CONF_TYPE): 'float',
    vol.Required(CONF_OUTPUTS): cv.ensure_list(cv.use_variable_id(output.FloatOutput)),
})


def validate_copy_output(value):
    if not isinstance(value, dict):
        raise vol.Invalid("Value must be dict")
    type = cv.string_strict(value.get(CONF_TYPE, 'float')).lower()
    value[CONF_TYPE] = type
    if type == 'binary':
        return BINARY_SCHEMA(value)
    if type == 'float':
        return FLOAT_SCHEMA(value)
    raise vol.Invalid("type must either be binary or float, not {}!".format(type))


PLATFORM_SCHEMA = validate_copy_output


def to_code(config):
    outputs = []
    for out in config[CONF_OUTPUTS]:
        for var in get_variable(out):
            yield
        outputs.append(var)

    klass = {
        'binary': BinaryCopyOutput,
        'float': FloatCopyOutput,
    }[config[CONF_TYPE]]
    rhs = klass.new(outputs)
    gpio = Pvariable(config[CONF_ID], rhs)

    output.setup_output_platform(gpio, config)
    setup_component(gpio, config)


BUILD_FLAGS = '-DUSE_COPY_OUTPUT'
@@ -1,68 +0,0 @@
import voluptuous as vol

from esphome.components import output
import esphome.config_validation as cv
from esphome.const import CONF_ID, CONF_LAMBDA, CONF_OUTPUTS, CONF_TYPE
from esphome.cpp_generator import process_lambda, variable
from esphome.cpp_types import std_vector

CustomBinaryOutputConstructor = output.output_ns.class_('CustomBinaryOutputConstructor')
CustomFloatOutputConstructor = output.output_ns.class_('CustomFloatOutputConstructor')

BINARY_SCHEMA = output.PLATFORM_SCHEMA.extend({
    cv.GenerateID(): cv.declare_variable_id(CustomBinaryOutputConstructor),
    vol.Required(CONF_LAMBDA): cv.lambda_,
    vol.Required(CONF_TYPE): 'binary',
    vol.Required(CONF_OUTPUTS):
        cv.ensure_list(output.BINARY_OUTPUT_SCHEMA.extend({
            cv.GenerateID(): cv.declare_variable_id(output.BinaryOutput),
        })),
})

FLOAT_SCHEMA = output.PLATFORM_SCHEMA.extend({
    cv.GenerateID(): cv.declare_variable_id(CustomFloatOutputConstructor),
    vol.Required(CONF_LAMBDA): cv.lambda_,
    vol.Required(CONF_TYPE): 'float',
    vol.Required(CONF_OUTPUTS):
        cv.ensure_list(output.FLOAT_OUTPUT_SCHEMA.extend({
            cv.GenerateID(): cv.declare_variable_id(output.FloatOutput),
        })),
})


def validate_custom_output(value):
    if not isinstance(value, dict):
        raise vol.Invalid("Value must be dict")
    if CONF_TYPE not in value:
        raise vol.Invalid("type not specified!")
    type = cv.string_strict(value[CONF_TYPE]).lower()
    value[CONF_TYPE] = type
    if type == 'binary':
        return BINARY_SCHEMA(value)
    if type == 'float':
        return FLOAT_SCHEMA(value)
    raise vol.Invalid("type must either be binary or float, not {}!".format(type))


PLATFORM_SCHEMA = validate_custom_output


def to_code(config):
    type = config[CONF_TYPE]
    if type == 'binary':
        ret_type = output.BinaryOutputPtr
        klass = CustomBinaryOutputConstructor
    else:
        ret_type = output.FloatOutputPtr
        klass = CustomFloatOutputConstructor
    for template_ in process_lambda(config[CONF_LAMBDA], [],
                                    return_type=std_vector.template(ret_type)):
        yield

    rhs = klass(template_)
    custom = variable(config[CONF_ID], rhs)
    for i, conf in enumerate(config[CONF_OUTPUTS]):
        output.register_output(custom.get_output(i), conf)


BUILD_FLAGS = '-DUSE_CUSTOM_OUTPUT'
@@ -1,32 +0,0 @@
import voluptuous as vol

from esphome.components import i2c, output
import esphome.config_validation as cv
from esphome.const import CONF_ADDRESS, CONF_FREQUENCY, CONF_ID
from esphome.cpp_generator import Pvariable, add
from esphome.cpp_helpers import setup_component
from esphome.cpp_types import App, Component

DEPENDENCIES = ['i2c']
MULTI_CONF = True

PCA9685OutputComponent = output.output_ns.class_('PCA9685OutputComponent',
                                                 Component, i2c.I2CDevice)

CONFIG_SCHEMA = cv.Schema({
    cv.GenerateID(): cv.declare_variable_id(PCA9685OutputComponent),
    vol.Required(CONF_FREQUENCY): vol.All(cv.frequency,
                                          vol.Range(min=23.84, max=1525.88)),
    vol.Optional(CONF_ADDRESS): cv.i2c_address,
}).extend(cv.COMPONENT_SCHEMA.schema)


def to_code(config):
    rhs = App.make_pca9685_component(config.get(CONF_FREQUENCY))
    pca9685 = Pvariable(config[CONF_ID], rhs)
    if CONF_ADDRESS in config:
        add(pca9685.set_address(config[CONF_ADDRESS]))
    setup_component(pca9685, config)


BUILD_FLAGS = '-DUSE_PCA9685_OUTPUT'
@@ -1,46 +0,0 @@
import voluptuous as vol

from esphome import automation, pins
from esphome.components import binary_sensor, spi
from esphome.components.spi import SPIComponent
import esphome.config_validation as cv
from esphome.const import CONF_CS_PIN, CONF_ID, CONF_ON_TAG, CONF_SPI_ID, CONF_TRIGGER_ID, \
    CONF_UPDATE_INTERVAL
from esphome.cpp_generator import Pvariable, get_variable
from esphome.cpp_helpers import gpio_output_pin_expression, setup_component
from esphome.cpp_types import App, PollingComponent, Trigger, std_string

DEPENDENCIES = ['spi']
MULTI_CONF = True

PN532Component = binary_sensor.binary_sensor_ns.class_('PN532Component', PollingComponent,
                                                       spi.SPIDevice)
PN532Trigger = binary_sensor.binary_sensor_ns.class_('PN532Trigger', Trigger.template(std_string))

CONFIG_SCHEMA = cv.Schema({
    cv.GenerateID(): cv.declare_variable_id(PN532Component),
    cv.GenerateID(CONF_SPI_ID): cv.use_variable_id(SPIComponent),
    vol.Required(CONF_CS_PIN): pins.gpio_output_pin_schema,
    vol.Optional(CONF_UPDATE_INTERVAL): cv.update_interval,
    vol.Optional(CONF_ON_TAG): automation.validate_automation({
        cv.GenerateID(CONF_TRIGGER_ID): cv.declare_variable_id(PN532Trigger),
    }),
}).extend(cv.COMPONENT_SCHEMA.schema)


def to_code(config):
    for spi_ in get_variable(config[CONF_SPI_ID]):
        yield
    for cs in gpio_output_pin_expression(config[CONF_CS_PIN]):
        yield
    rhs = App.make_pn532_component(spi_, cs, config.get(CONF_UPDATE_INTERVAL))
    pn532 = Pvariable(config[CONF_ID], rhs)

    for conf_ in config.get(CONF_ON_TAG, []):
        trigger = Pvariable(conf_[CONF_TRIGGER_ID], pn532.make_trigger())
        automation.build_automations(trigger, [(std_string, 'x')], conf_)

    setup_component(pn532, config)


BUILD_FLAGS = '-DUSE_PN532'
@@ -1,36 +0,0 @@
import voluptuous as vol

from esphome import pins
import esphome.config_validation as cv
from esphome.const import CONF_ENABLE_TIME, CONF_ID, CONF_KEEP_ON_TIME, CONF_PIN
from esphome.cpp_generator import Pvariable, add
from esphome.cpp_helpers import gpio_output_pin_expression, setup_component
from esphome.cpp_types import App, Component, esphome_ns

PowerSupplyComponent = esphome_ns.class_('PowerSupplyComponent', Component)

MULTI_CONF = True

CONFIG_SCHEMA = cv.Schema({
    vol.Required(CONF_ID): cv.declare_variable_id(PowerSupplyComponent),
    vol.Required(CONF_PIN): pins.gpio_output_pin_schema,
    vol.Optional(CONF_ENABLE_TIME): cv.positive_time_period_milliseconds,
    vol.Optional(CONF_KEEP_ON_TIME): cv.positive_time_period_milliseconds,
}).extend(cv.COMPONENT_SCHEMA.schema)


def to_code(config):
    for pin in gpio_output_pin_expression(config[CONF_PIN]):
        yield

    rhs = App.make_power_supply(pin)
    psu = Pvariable(config[CONF_ID], rhs)
    if CONF_ENABLE_TIME in config:
        add(psu.set_enable_time(config[CONF_ENABLE_TIME]))
    if CONF_KEEP_ON_TIME in config:
        add(psu.set_keep_on_time(config[CONF_KEEP_ON_TIME]))

    setup_component(psu, config)


BUILD_FLAGS = '-DUSE_OUTPUT'
@@ -1,27 +0,0 @@
from esphome.components import binary_sensor, uart
import esphome.config_validation as cv
from esphome.const import CONF_ID, CONF_UART_ID
from esphome.cpp_generator import Pvariable, get_variable
from esphome.cpp_helpers import setup_component
from esphome.cpp_types import App, Component

DEPENDENCIES = ['uart']

RDM6300Component = binary_sensor.binary_sensor_ns.class_('RDM6300Component', Component,
                                                         uart.UARTDevice)

CONFIG_SCHEMA = cv.Schema({
    cv.GenerateID(): cv.declare_variable_id(RDM6300Component),
    cv.GenerateID(CONF_UART_ID): cv.use_variable_id(uart.UARTComponent),
}).extend(cv.COMPONENT_SCHEMA.schema)


def to_code(config):
    for uart_ in get_variable(config[CONF_UART_ID]):
        yield
    rhs = App.make_rdm6300_component(uart_)
    var = Pvariable(config[CONF_ID], rhs)
    setup_component(var, config)


BUILD_FLAGS = '-DUSE_RDM6300'
@@ -1,76 +0,0 @@
import voluptuous as vol

from esphome import pins
import esphome.config_validation as cv
from esphome.const import CONF_BUFFER_SIZE, CONF_DUMP, CONF_FILTER, CONF_ID, CONF_IDLE, \
    CONF_PIN, CONF_TOLERANCE
from esphome.cpp_generator import Pvariable, add
from esphome.cpp_helpers import gpio_input_pin_expression, setup_component
from esphome.cpp_types import App, Component, esphome_ns
from esphome.py_compat import string_types

remote_ns = esphome_ns.namespace('remote')
MULTI_CONF = True

RemoteControlComponentBase = remote_ns.class_('RemoteControlComponentBase')
RemoteReceiverComponent = remote_ns.class_('RemoteReceiverComponent',
                                           RemoteControlComponentBase,
                                           Component)

RemoteReceiveDumper = remote_ns.class_('RemoteReceiveDumper')

DUMPERS = {
    'jvc': remote_ns.class_('JVCDumper', RemoteReceiveDumper),
    'lg': remote_ns.class_('LGDumper', RemoteReceiveDumper),
    'nec': remote_ns.class_('NECDumper', RemoteReceiveDumper),
    'panasonic': remote_ns.class_('PanasonicDumper', RemoteReceiveDumper),
    'raw': remote_ns.class_('RawDumper', RemoteReceiveDumper),
    'samsung': remote_ns.class_('SamsungDumper', RemoteReceiveDumper),
    'sony': remote_ns.class_('SonyDumper', RemoteReceiveDumper),
    'rc_switch': remote_ns.class_('RCSwitchDumper', RemoteReceiveDumper),
    'rc5': remote_ns.class_('RC5Dumper', RemoteReceiveDumper),
}


def validate_dumpers_all(value):
    if not isinstance(value, string_types):
        raise vol.Invalid("Expected 'ALL' to select all dumpers")
    if value.upper() == "ALL":
        return list(sorted(list(DUMPERS)))
    raise vol.Invalid("Expected 'ALL' to select all dumpers")


CONFIG_SCHEMA = cv.Schema({
    cv.GenerateID(): cv.declare_variable_id(RemoteReceiverComponent),
    vol.Required(CONF_PIN): vol.All(pins.internal_gpio_input_pin_schema,
                                    pins.validate_has_interrupt),
    vol.Optional(CONF_DUMP, default=[]):
        vol.Any(validate_dumpers_all, cv.ensure_list(cv.one_of(*DUMPERS, lower=True))),
    vol.Optional(CONF_TOLERANCE): vol.All(cv.percentage_int, vol.Range(min=0)),
    vol.Optional(CONF_BUFFER_SIZE): cv.validate_bytes,
    vol.Optional(CONF_FILTER): cv.positive_time_period_microseconds,
    vol.Optional(CONF_IDLE): cv.positive_time_period_microseconds,
}).extend(cv.COMPONENT_SCHEMA.schema)


def to_code(config):
    for pin in gpio_input_pin_expression(config[CONF_PIN]):
        yield
    rhs = App.make_remote_receiver_component(pin)
    receiver = Pvariable(config[CONF_ID], rhs)

    for dumper in config[CONF_DUMP]:
        add(receiver.add_dumper(DUMPERS[dumper].new()))
    if CONF_TOLERANCE in config:
        add(receiver.set_tolerance(config[CONF_TOLERANCE]))
    if CONF_BUFFER_SIZE in config:
        add(receiver.set_buffer_size(config[CONF_BUFFER_SIZE]))
    if CONF_FILTER in config:
        add(receiver.set_filter_us(config[CONF_FILTER]))
    if CONF_IDLE in config:
        add(receiver.set_idle_us(config[CONF_IDLE]))

    setup_component(receiver, config)


BUILD_FLAGS = '-DUSE_REMOTE_RECEIVER'
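The `dump:` option accepts either a list of protocol names or the shorthand string `ALL`, which `validate_dumpers_all` expands to every known dumper. A standalone check of that shorthand logic, with `DUMPERS` stubbed down to its keys and `str` in place of the py2/3 `string_types` shim (requires voluptuous):

```python
import voluptuous as vol

DUMPERS = {'jvc': None, 'nec': None, 'sony': None}  # stub of the real mapping


def validate_dumpers_all(value):
    if not isinstance(value, str):
        raise vol.Invalid("Expected 'ALL' to select all dumpers")
    if value.upper() == 'ALL':
        return sorted(DUMPERS)
    raise vol.Invalid("Expected 'ALL' to select all dumpers")


print(validate_dumpers_all('all'))  # -> ['jvc', 'nec', 'sony']
```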
@@ -1,52 +0,0 @@
import voluptuous as vol

from esphome import automation
from esphome.automation import ACTION_REGISTRY, maybe_simple_id
import esphome.config_validation as cv
from esphome.const import CONF_ID
from esphome.cpp_generator import Pvariable, get_variable
from esphome.cpp_types import Action, Trigger, esphome_ns

Script = esphome_ns.class_('Script', Trigger.template())
ScriptExecuteAction = esphome_ns.class_('ScriptExecuteAction', Action)
ScriptStopAction = esphome_ns.class_('ScriptStopAction', Action)

CONFIG_SCHEMA = automation.validate_automation({
    vol.Required(CONF_ID): cv.declare_variable_id(Script),
})


def to_code(config):
    for conf in config:
        trigger = Pvariable(conf[CONF_ID], Script.new())
        automation.build_automations(trigger, [], conf)


CONF_SCRIPT_EXECUTE = 'script.execute'
SCRIPT_EXECUTE_ACTION_SCHEMA = maybe_simple_id({
    vol.Required(CONF_ID): cv.use_variable_id(Script),
})


@ACTION_REGISTRY.register(CONF_SCRIPT_EXECUTE, SCRIPT_EXECUTE_ACTION_SCHEMA)
def script_execute_action_to_code(config, action_id, template_arg, args):
    for var in get_variable(config[CONF_ID]):
        yield None
    rhs = var.make_execute_action(template_arg)
    type = ScriptExecuteAction.template(template_arg)
    yield Pvariable(action_id, rhs, type=type)


CONF_SCRIPT_STOP = 'script.stop'
SCRIPT_STOP_ACTION_SCHEMA = maybe_simple_id({
    vol.Required(CONF_ID): cv.use_variable_id(Script),
})


@ACTION_REGISTRY.register(CONF_SCRIPT_STOP, SCRIPT_STOP_ACTION_SCHEMA)
def script_stop_action_to_code(config, action_id, template_arg, args):
    for var in get_variable(config[CONF_ID]):
        yield None
    rhs = var.make_stop_action(template_arg)
    type = ScriptStopAction.template(template_arg)
    yield Pvariable(action_id, rhs, type=type)
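`maybe_simple_id` is what lets a user write `script.execute: my_script` instead of the full mapping form. A simplified sketch of just that normalization step (the real esphome helper also runs the wrapped schema afterwards):

```python
# Hedged sketch: a bare value is wrapped into {'id': value}; a mapping
# passes through unchanged. Names here mirror the schema above but the
# implementation is illustrative, not esphome's.
def maybe_simple_id(schema):
    def validator(value):
        if not isinstance(value, dict):
            return {'id': value}
        return value
    return validator


norm = maybe_simple_id({'id': str})
print(norm('my_script'))          # -> {'id': 'my_script'}
print(norm({'id': 'my_script'}))  # -> {'id': 'my_script'}
```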
@@ -1,35 +0,0 @@
import voluptuous as vol

from esphome.components import sensor
from esphome.components.apds9960 import APDS9960, CONF_APDS9960_ID
import esphome.config_validation as cv
from esphome.const import CONF_NAME, CONF_TYPE
from esphome.cpp_generator import get_variable

DEPENDENCIES = ['apds9960']

TYPES = {
    'CLEAR': 'make_clear_channel',
    'RED': 'make_red_channel',
    'GREEN': 'make_green_channel',
    'BLUE': 'make_blue_channel',
    'PROXIMITY': 'make_proximity',
}

PLATFORM_SCHEMA = cv.nameable(sensor.SENSOR_PLATFORM_SCHEMA.extend({
    cv.GenerateID(): cv.declare_variable_id(sensor.Sensor),
    vol.Required(CONF_TYPE): cv.one_of(*TYPES, upper=True),
    cv.GenerateID(CONF_APDS9960_ID): cv.use_variable_id(APDS9960)
}))


def to_code(config):
    for hub in get_variable(config[CONF_APDS9960_ID]):
        yield
    func = getattr(hub, TYPES[config[CONF_TYPE]])
    rhs = func(config[CONF_NAME])
    sensor.register_sensor(rhs, config)


def to_hass_config(data, config):
    return sensor.core_to_hass_config(data, config)
@@ -1,37 +0,0 @@
import voluptuous as vol

from esphome.components import sensor
import esphome.config_validation as cv
from esphome.const import CONF_ID, CONF_LAMBDA, CONF_NAME, CONF_SENSORS
from esphome.cpp_generator import add, process_lambda, variable
from esphome.cpp_types import std_vector

CustomSensorConstructor = sensor.sensor_ns.class_('CustomSensorConstructor')

PLATFORM_SCHEMA = sensor.PLATFORM_SCHEMA.extend({
    cv.GenerateID(): cv.declare_variable_id(CustomSensorConstructor),
    vol.Required(CONF_LAMBDA): cv.lambda_,
    vol.Required(CONF_SENSORS): cv.ensure_list(sensor.SENSOR_SCHEMA.extend({
        cv.GenerateID(): cv.declare_variable_id(sensor.Sensor),
    })),
})


def to_code(config):
    for template_ in process_lambda(config[CONF_LAMBDA], [],
                                    return_type=std_vector.template(sensor.SensorPtr)):
        yield

    rhs = CustomSensorConstructor(template_)
    custom = variable(config[CONF_ID], rhs)
    for i, conf in enumerate(config[CONF_SENSORS]):
        rhs = custom.Pget_sensor(i)
        add(rhs.set_name(conf[CONF_NAME]))
        sensor.register_sensor(rhs, conf)


BUILD_FLAGS = '-DUSE_CUSTOM_SENSOR'


def to_hass_config(data, config):
    return [sensor.core_to_hass_config(data, sens) for sens in config[CONF_SENSORS]]
@@ -1,29 +0,0 @@
import voluptuous as vol

from esphome.components import sensor
import esphome.config_validation as cv
from esphome.const import CONF_ENTITY_ID, CONF_ID, CONF_NAME
from esphome.cpp_generator import Pvariable
from esphome.cpp_types import App

DEPENDENCIES = ['api']

HomeassistantSensor = sensor.sensor_ns.class_('HomeassistantSensor', sensor.Sensor)

PLATFORM_SCHEMA = cv.nameable(sensor.SENSOR_PLATFORM_SCHEMA.extend({
    cv.GenerateID(): cv.declare_variable_id(HomeassistantSensor),
    vol.Required(CONF_ENTITY_ID): cv.entity_id,
}))


def to_code(config):
    rhs = App.make_homeassistant_sensor(config[CONF_NAME], config[CONF_ENTITY_ID])
    subs = Pvariable(config[CONF_ID], rhs)
    sensor.setup_sensor(subs, config)


BUILD_FLAGS = '-DUSE_HOMEASSISTANT_SENSOR'


def to_hass_config(data, config):
    return sensor.core_to_hass_config(data, config)
@@ -1,39 +0,0 @@
import voluptuous as vol

from esphome import pins
from esphome.components import sensor, spi
from esphome.components.spi import SPIComponent
import esphome.config_validation as cv
from esphome.const import CONF_CS_PIN, CONF_ID, CONF_NAME, CONF_SPI_ID, CONF_UPDATE_INTERVAL
from esphome.cpp_generator import Pvariable, get_variable
from esphome.cpp_helpers import gpio_output_pin_expression, setup_component
from esphome.cpp_types import App

MAX31855Sensor = sensor.sensor_ns.class_('MAX31855Sensor', sensor.PollingSensorComponent,
                                         spi.SPIDevice)

PLATFORM_SCHEMA = cv.nameable(sensor.SENSOR_PLATFORM_SCHEMA.extend({
    cv.GenerateID(): cv.declare_variable_id(MAX31855Sensor),
    cv.GenerateID(CONF_SPI_ID): cv.use_variable_id(SPIComponent),
    vol.Required(CONF_CS_PIN): pins.gpio_output_pin_schema,
    vol.Optional(CONF_UPDATE_INTERVAL): cv.update_interval,
}).extend(cv.COMPONENT_SCHEMA.schema))


def to_code(config):
    for spi_ in get_variable(config[CONF_SPI_ID]):
        yield
    for cs in gpio_output_pin_expression(config[CONF_CS_PIN]):
        yield
    rhs = App.make_max31855_sensor(config[CONF_NAME], spi_, cs,
                                   config.get(CONF_UPDATE_INTERVAL))
    max31855 = Pvariable(config[CONF_ID], rhs)
    sensor.setup_sensor(max31855, config)
    setup_component(max31855, config)


BUILD_FLAGS = '-DUSE_MAX31855_SENSOR'


def to_hass_config(data, config):
    return sensor.core_to_hass_config(data, config)
@@ -1,76 +0,0 @@
import voluptuous as vol

from esphome import pins
from esphome.components import sensor
import esphome.config_validation as cv
from esphome.const import CONF_ID, CONF_NAME, CONF_RESOLUTION, CONF_MIN_VALUE, CONF_MAX_VALUE
from esphome.cpp_generator import Pvariable, add
from esphome.cpp_helpers import gpio_input_pin_expression, setup_component
from esphome.cpp_types import App, Component

RotaryEncoderResolution = sensor.sensor_ns.enum('RotaryEncoderResolution')
RESOLUTIONS = {
    1: RotaryEncoderResolution.ROTARY_ENCODER_1_PULSE_PER_CYCLE,
    2: RotaryEncoderResolution.ROTARY_ENCODER_2_PULSES_PER_CYCLE,
    4: RotaryEncoderResolution.ROTARY_ENCODER_4_PULSES_PER_CYCLE,
}

CONF_PIN_A = 'pin_a'
CONF_PIN_B = 'pin_b'
CONF_PIN_RESET = 'pin_reset'

RotaryEncoderSensor = sensor.sensor_ns.class_('RotaryEncoderSensor', sensor.Sensor, Component)


def validate_min_max_value(config):
    if CONF_MIN_VALUE in config and CONF_MAX_VALUE in config:
        min_val = config[CONF_MIN_VALUE]
        max_val = config[CONF_MAX_VALUE]
        if min_val >= max_val:
            raise vol.Invalid("Max value {} must be larger than min value {}"
                              "".format(max_val, min_val))
    return config


PLATFORM_SCHEMA = cv.nameable(sensor.SENSOR_PLATFORM_SCHEMA.extend({
    cv.GenerateID(): cv.declare_variable_id(RotaryEncoderSensor),
    vol.Required(CONF_PIN_A): vol.All(pins.internal_gpio_input_pin_schema,
                                      pins.validate_has_interrupt),
    vol.Required(CONF_PIN_B): vol.All(pins.internal_gpio_input_pin_schema,
                                      pins.validate_has_interrupt),
    vol.Optional(CONF_PIN_RESET): pins.internal_gpio_input_pin_schema,
    vol.Optional(CONF_RESOLUTION): cv.one_of(*RESOLUTIONS, int=True),
    vol.Optional(CONF_MIN_VALUE): cv.int_,
    vol.Optional(CONF_MAX_VALUE): cv.int_,
}).extend(cv.COMPONENT_SCHEMA.schema), validate_min_max_value)


def to_code(config):
    for pin_a in gpio_input_pin_expression(config[CONF_PIN_A]):
        yield
    for pin_b in gpio_input_pin_expression(config[CONF_PIN_B]):
        yield
    rhs = App.make_rotary_encoder_sensor(config[CONF_NAME], pin_a, pin_b)
    encoder = Pvariable(config[CONF_ID], rhs)

    if CONF_PIN_RESET in config:
        for pin_i in gpio_input_pin_expression(config[CONF_PIN_RESET]):
            yield
        add(encoder.set_reset_pin(pin_i))
    if CONF_RESOLUTION in config:
        resolution = RESOLUTIONS[config[CONF_RESOLUTION]]
        add(encoder.set_resolution(resolution))
    if CONF_MIN_VALUE in config:
        add(encoder.set_min_value(config[CONF_MIN_VALUE]))
    if CONF_MAX_VALUE in config:
        add(encoder.set_max_value(config[CONF_MAX_VALUE]))

    sensor.setup_sensor(encoder, config)
    setup_component(encoder, config)


BUILD_FLAGS = '-DUSE_ROTARY_ENCODER_SENSOR'


def to_hass_config(data, config):
    return sensor.core_to_hass_config(data, config)
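The cross-field validator above runs on the whole config mapping after the per-key schema, which is why it is attached as a second argument to `cv.nameable(...)`. A quick standalone check of the same logic with plain string keys (requires voluptuous):

```python
import voluptuous as vol


def validate_min_max_value(config):
    if 'min_value' in config and 'max_value' in config:
        if config['min_value'] >= config['max_value']:
            raise vol.Invalid("max_value must be larger than min_value")
    return config


validate_min_max_value({'min_value': 0, 'max_value': 10})    # passes
# validate_min_max_value({'min_value': 5, 'max_value': 5})   # raises vol.Invalid
```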
@@ -1,78 +0,0 @@
import voluptuous as vol

from esphome.components import sensor, uart
from esphome.components.uart import UARTComponent
import esphome.config_validation as cv
from esphome.const import (CONF_ID, CONF_NAME, CONF_PM_10_0, CONF_PM_2_5, CONF_RX_ONLY,
                           CONF_UART_ID, CONF_UPDATE_INTERVAL)
from esphome.cpp_generator import Pvariable, add, get_variable
from esphome.cpp_helpers import setup_component
from esphome.cpp_types import App, Component

DEPENDENCIES = ['uart']

SDS011Component = sensor.sensor_ns.class_('SDS011Component', uart.UARTDevice, Component)
SDS011Sensor = sensor.sensor_ns.class_('SDS011Sensor', sensor.EmptySensor)


def validate_sds011_rx_mode(value):
    if CONF_UPDATE_INTERVAL in value and not value.get(CONF_RX_ONLY):
        update_interval = value[CONF_UPDATE_INTERVAL]
        if update_interval.total_minutes > 30:
            raise vol.Invalid("Maximum update interval is 30min")
    elif value.get(CONF_RX_ONLY) and CONF_UPDATE_INTERVAL in value:
        # update_interval does not affect anything in rx-only mode, so reject
        # it with a clear error instead of silently ignoring it.
        raise vol.Invalid("update_interval has no effect in rx_only mode. Please remove it.",
                          path=['update_interval'])
    return value


SDS011_SENSOR_SCHEMA = sensor.SENSOR_SCHEMA.extend({
    cv.GenerateID(): cv.declare_variable_id(SDS011Sensor),
})

PLATFORM_SCHEMA = vol.All(sensor.PLATFORM_SCHEMA.extend({
    cv.GenerateID(): cv.declare_variable_id(SDS011Component),
    cv.GenerateID(CONF_UART_ID): cv.use_variable_id(UARTComponent),

    vol.Optional(CONF_RX_ONLY): cv.boolean,

    vol.Optional(CONF_PM_2_5): cv.nameable(SDS011_SENSOR_SCHEMA),
    vol.Optional(CONF_PM_10_0): cv.nameable(SDS011_SENSOR_SCHEMA),
    vol.Optional(CONF_UPDATE_INTERVAL): cv.positive_time_period_minutes,
}).extend(cv.COMPONENT_SCHEMA.schema), cv.has_at_least_one_key(CONF_PM_2_5, CONF_PM_10_0),
                          validate_sds011_rx_mode)


def to_code(config):
    for uart_ in get_variable(config[CONF_UART_ID]):
        yield

    rhs = App.make_sds011(uart_)
    sds011 = Pvariable(config[CONF_ID], rhs)

    if CONF_UPDATE_INTERVAL in config:
        add(sds011.set_update_interval_min(config.get(CONF_UPDATE_INTERVAL)))
    if CONF_RX_ONLY in config:
        add(sds011.set_rx_mode_only(config[CONF_RX_ONLY]))

    if CONF_PM_2_5 in config:
        conf = config[CONF_PM_2_5]
        sensor.register_sensor(sds011.make_pm_2_5_sensor(conf[CONF_NAME]), conf)
    if CONF_PM_10_0 in config:
        conf = config[CONF_PM_10_0]
        sensor.register_sensor(sds011.make_pm_10_0_sensor(conf[CONF_NAME]), conf)

    setup_component(sds011, config)


BUILD_FLAGS = '-DUSE_SDS011'


def to_hass_config(data, config):
    ret = []
    for key in (CONF_PM_2_5, CONF_PM_10_0):
        if key in config:
            ret.append(sensor.core_to_hass_config(data, config[key]))
    return ret
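The rx-mode rule above encodes two constraints: an `update_interval` only makes sense when the component actively polls the sensor (and the SDS011's working-period register caps it at 30 minutes), and it is meaningless with `rx_only: true`. A hedged sketch with the time period stubbed as plain minutes (requires voluptuous):

```python
import voluptuous as vol


def validate_sds011_rx_mode(value):
    # update_interval stubbed as an int number of minutes for this sketch.
    if 'update_interval' in value and not value.get('rx_only'):
        if value['update_interval'] > 30:
            raise vol.Invalid("Maximum update interval is 30min")
    elif value.get('rx_only') and 'update_interval' in value:
        raise vol.Invalid("update_interval has no effect in rx_only mode. "
                          "Please remove it.")
    return value


print(validate_sds011_rx_mode({'update_interval': 10}))  # passes
# validate_sds011_rx_mode({'rx_only': True, 'update_interval': 10})  # raises
```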
@@ -1,58 +0,0 @@
import voluptuous as vol

from esphome.automation import ACTION_REGISTRY
from esphome.components import sensor
import esphome.config_validation as cv
from esphome.const import CONF_ID, CONF_LAMBDA, CONF_NAME, CONF_STATE, CONF_UPDATE_INTERVAL
from esphome.cpp_generator import Pvariable, add, get_variable, process_lambda, templatable
from esphome.cpp_helpers import setup_component
from esphome.cpp_types import Action, App, float_, optional

TemplateSensor = sensor.sensor_ns.class_('TemplateSensor', sensor.PollingSensorComponent)
SensorPublishAction = sensor.sensor_ns.class_('SensorPublishAction', Action)

PLATFORM_SCHEMA = cv.nameable(sensor.SENSOR_PLATFORM_SCHEMA.extend({
    cv.GenerateID(): cv.declare_variable_id(TemplateSensor),
    vol.Optional(CONF_LAMBDA): cv.lambda_,
    vol.Optional(CONF_UPDATE_INTERVAL): cv.update_interval,
}).extend(cv.COMPONENT_SCHEMA.schema))


def to_code(config):
    rhs = App.make_template_sensor(config[CONF_NAME], config.get(CONF_UPDATE_INTERVAL))
    template = Pvariable(config[CONF_ID], rhs)

    sensor.setup_sensor(template, config)
    setup_component(template, config)

    if CONF_LAMBDA in config:
        for template_ in process_lambda(config[CONF_LAMBDA], [],
                                        return_type=optional.template(float_)):
            yield
        add(template.set_template(template_))


BUILD_FLAGS = '-DUSE_TEMPLATE_SENSOR'

CONF_SENSOR_TEMPLATE_PUBLISH = 'sensor.template.publish'
SENSOR_TEMPLATE_PUBLISH_ACTION_SCHEMA = cv.Schema({
    vol.Required(CONF_ID): cv.use_variable_id(sensor.Sensor),
    vol.Required(CONF_STATE): cv.templatable(cv.float_),
})


@ACTION_REGISTRY.register(CONF_SENSOR_TEMPLATE_PUBLISH, SENSOR_TEMPLATE_PUBLISH_ACTION_SCHEMA)
def sensor_template_publish_to_code(config, action_id, template_arg, args):
    for var in get_variable(config[CONF_ID]):
        yield None
    rhs = var.make_sensor_publish_action(template_arg)
    type = SensorPublishAction.template(template_arg)
    action = Pvariable(action_id, rhs, type=type)
    for template_ in templatable(config[CONF_STATE], args, float_):
        yield None
    add(action.set_state(template_))
    yield action


def to_hass_config(data, config):
    return sensor.core_to_hass_config(data, config)
@@ -1,58 +0,0 @@
import voluptuous as vol

from esphome import pins
from esphome.components import sensor
import esphome.config_validation as cv
from esphome.const import CONF_ECHO_PIN, CONF_ID, CONF_NAME, CONF_TRIGGER_PIN, \
    CONF_UPDATE_INTERVAL, CONF_TIMEOUT
from esphome.cpp_generator import Pvariable, add
from esphome.cpp_helpers import gpio_input_pin_expression, gpio_output_pin_expression, \
    setup_component
from esphome.cpp_types import App

CONF_PULSE_TIME = 'pulse_time'

UltrasonicSensorComponent = sensor.sensor_ns.class_('UltrasonicSensorComponent',
                                                    sensor.PollingSensorComponent)


PLATFORM_SCHEMA = cv.nameable(sensor.SENSOR_PLATFORM_SCHEMA.extend({
    cv.GenerateID(): cv.declare_variable_id(UltrasonicSensorComponent),
    vol.Required(CONF_TRIGGER_PIN): pins.gpio_output_pin_schema,
    vol.Required(CONF_ECHO_PIN): pins.internal_gpio_input_pin_schema,

    vol.Optional(CONF_TIMEOUT): cv.distance,
    vol.Optional(CONF_PULSE_TIME): cv.positive_time_period_microseconds,
    vol.Optional(CONF_UPDATE_INTERVAL): cv.update_interval,

    vol.Optional('timeout_meter'): cv.invalid("The timeout_meter option has been renamed "
                                              "to 'timeout'."),
    vol.Optional('timeout_time'): cv.invalid("The timeout_time option has been removed. Please "
                                             "use 'timeout'."),
}))


def to_code(config):
    for trigger in gpio_output_pin_expression(config[CONF_TRIGGER_PIN]):
        yield
    for echo in gpio_input_pin_expression(config[CONF_ECHO_PIN]):
        yield
    rhs = App.make_ultrasonic_sensor(config[CONF_NAME], trigger, echo,
                                     config.get(CONF_UPDATE_INTERVAL))
    ultrasonic = Pvariable(config[CONF_ID], rhs)

    if CONF_TIMEOUT in config:
        # Convert the configured timeout distance (m) into the round-trip
        # echo time in microseconds (speed of sound ~0.000343 m/us).
        add(ultrasonic.set_timeout_us(config[CONF_TIMEOUT] / (0.000343 / 2)))

    if CONF_PULSE_TIME in config:
        add(ultrasonic.set_pulse_time_us(config[CONF_PULSE_TIME]))

    sensor.setup_sensor(ultrasonic, config)
    setup_component(ultrasonic, config)


BUILD_FLAGS = '-DUSE_ULTRASONIC_SENSOR'


def to_hass_config(data, config):
    return sensor.core_to_hass_config(data, config)
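The `timeout` option is specified as a distance but the firmware wants a time, so the codegen divides by half the speed of sound: the echo has to travel out and back, giving `t = 2d / 343 m/s`. A worked instance of the same expression:

```python
# Same conversion as set_timeout_us(...) above: one-way distance in metres
# to round-trip echo time in microseconds, with sound at ~0.000343 m/us.
def timeout_us(distance_m):
    return distance_m / (0.000343 / 2)


print(round(timeout_us(2.0)))  # -> 11662 us for a 2 m timeout
```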
@@ -1,59 +0,0 @@
import voluptuous as vol

from esphome.automation import ACTION_REGISTRY
from esphome.components.output import FloatOutput
import esphome.config_validation as cv
from esphome.const import CONF_ID, CONF_IDLE_LEVEL, CONF_MAX_LEVEL, CONF_MIN_LEVEL, CONF_OUTPUT, \
    CONF_LEVEL
from esphome.cpp_generator import Pvariable, add, get_variable, templatable
from esphome.cpp_helpers import setup_component
from esphome.cpp_types import App, Component, esphome_ns, Action, float_

Servo = esphome_ns.class_('Servo', Component)
ServoWriteAction = esphome_ns.class_('ServoWriteAction', Action)

MULTI_CONF = True

CONFIG_SCHEMA = cv.Schema({
    vol.Required(CONF_ID): cv.declare_variable_id(Servo),
    vol.Required(CONF_OUTPUT): cv.use_variable_id(FloatOutput),
    vol.Optional(CONF_MIN_LEVEL, default='3%'): cv.percentage,
    vol.Optional(CONF_IDLE_LEVEL, default='7.5%'): cv.percentage,
    vol.Optional(CONF_MAX_LEVEL, default='12%'): cv.percentage,
}).extend(cv.COMPONENT_SCHEMA.schema)


def to_code(config):
    for out in get_variable(config[CONF_OUTPUT]):
        yield

    rhs = App.register_component(Servo.new(out))
    servo = Pvariable(config[CONF_ID], rhs)

    add(servo.set_min_level(config[CONF_MIN_LEVEL]))
    add(servo.set_idle_level(config[CONF_IDLE_LEVEL]))
    add(servo.set_max_level(config[CONF_MAX_LEVEL]))

    setup_component(servo, config)


BUILD_FLAGS = '-DUSE_SERVO'

CONF_SERVO_WRITE = 'servo.write'
SERVO_WRITE_ACTION_SCHEMA = cv.Schema({
    vol.Required(CONF_ID): cv.use_variable_id(Servo),
    vol.Required(CONF_LEVEL): cv.templatable(cv.possibly_negative_percentage),
})


@ACTION_REGISTRY.register(CONF_SERVO_WRITE, SERVO_WRITE_ACTION_SCHEMA)
def servo_write_to_code(config, action_id, template_arg, args):
    for var in get_variable(config[CONF_ID]):
        yield None
    rhs = ServoWriteAction.new(template_arg, var)
    type = ServoWriteAction.template(template_arg)
    action = Pvariable(action_id, rhs, type=type)
    for template_ in templatable(config[CONF_LEVEL], args, float_):
        yield None
    add(action.set_value(template_))
    yield action
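The 3% / 7.5% / 12% level defaults are duty-cycle fractions of the PWM output. Assuming the underlying float output runs at the usual 50 Hz servo frequency (an assumption, not stated in this file), one period is 20 ms and the defaults land on the classic 0.6 / 1.5 / 2.4 ms pulse widths:

```python
# Duty fraction * period = pulse width; assumes a 50 Hz PWM output.
period_ms = 20.0  # 1 / 50 Hz
for name, level in [('min', 0.03), ('idle', 0.075), ('max', 0.12)]:
    print(name, level * period_ms, 'ms')  # -> 0.6 / 1.5 / 2.4 ms
```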
@@ -1,40 +0,0 @@
import voluptuous as vol

from esphome import pins
import esphome.config_validation as cv
from esphome.const import CONF_CLK_PIN, CONF_ID, CONF_MISO_PIN, CONF_MOSI_PIN
from esphome.cpp_generator import Pvariable, add
from esphome.cpp_helpers import gpio_input_pin_expression, gpio_output_pin_expression, \
    setup_component
from esphome.cpp_types import App, Component, esphome_ns

SPIComponent = esphome_ns.class_('SPIComponent', Component)
SPIDevice = esphome_ns.class_('SPIDevice')
MULTI_CONF = True

CONFIG_SCHEMA = vol.All(cv.Schema({
    cv.GenerateID(): cv.declare_variable_id(SPIComponent),
    vol.Required(CONF_CLK_PIN): pins.gpio_output_pin_schema,
    vol.Optional(CONF_MISO_PIN): pins.gpio_input_pin_schema,
    vol.Optional(CONF_MOSI_PIN): pins.gpio_output_pin_schema,
}), cv.has_at_least_one_key(CONF_MISO_PIN, CONF_MOSI_PIN))


def to_code(config):
    for clk in gpio_output_pin_expression(config[CONF_CLK_PIN]):
        yield
    rhs = App.init_spi(clk)
    spi = Pvariable(config[CONF_ID], rhs)
    if CONF_MISO_PIN in config:
        for miso in gpio_input_pin_expression(config[CONF_MISO_PIN]):
            yield
        add(spi.set_miso(miso))
    if CONF_MOSI_PIN in config:
        # MOSI is driven by the ESP, matching gpio_output_pin_schema above.
        for mosi in gpio_output_pin_expression(config[CONF_MOSI_PIN]):
            yield
        add(spi.set_mosi(mosi))

    setup_component(spi, config)


BUILD_FLAGS = '-DUSE_SPI'
@@ -1,55 +0,0 @@
import voluptuous as vol

from esphome import pins
from esphome.components import stepper
import esphome.config_validation as cv
from esphome.const import CONF_ID, CONF_PIN_A, CONF_PIN_B, CONF_PIN_C, CONF_PIN_D, \
    CONF_SLEEP_WHEN_DONE, CONF_STEP_MODE
from esphome.cpp_generator import Pvariable, add
from esphome.cpp_helpers import gpio_output_pin_expression, setup_component
from esphome.cpp_types import App, Component

ULN2003StepMode = stepper.stepper_ns.enum('ULN2003StepMode')

STEP_MODES = {
    'FULL_STEP': ULN2003StepMode.ULN2003_STEP_MODE_FULL_STEP,
    'HALF_STEP': ULN2003StepMode.ULN2003_STEP_MODE_HALF_STEP,
    'WAVE_DRIVE': ULN2003StepMode.ULN2003_STEP_MODE_WAVE_DRIVE,
}

ULN2003 = stepper.stepper_ns.class_('ULN2003', stepper.Stepper, Component)

PLATFORM_SCHEMA = stepper.STEPPER_PLATFORM_SCHEMA.extend({
    vol.Required(CONF_ID): cv.declare_variable_id(ULN2003),
    vol.Required(CONF_PIN_A): pins.gpio_output_pin_schema,
    vol.Required(CONF_PIN_B): pins.gpio_output_pin_schema,
    vol.Required(CONF_PIN_C): pins.gpio_output_pin_schema,
    vol.Required(CONF_PIN_D): pins.gpio_output_pin_schema,
    vol.Optional(CONF_SLEEP_WHEN_DONE): cv.boolean,
    vol.Optional(CONF_STEP_MODE): cv.one_of(*STEP_MODES, upper=True, space='_')
}).extend(cv.COMPONENT_SCHEMA.schema)


def to_code(config):
    for pin_a in gpio_output_pin_expression(config[CONF_PIN_A]):
        yield
    for pin_b in gpio_output_pin_expression(config[CONF_PIN_B]):
        yield
    for pin_c in gpio_output_pin_expression(config[CONF_PIN_C]):
        yield
    for pin_d in gpio_output_pin_expression(config[CONF_PIN_D]):
        yield
    rhs = App.make_uln2003(pin_a, pin_b, pin_c, pin_d)
    uln = Pvariable(config[CONF_ID], rhs)

    if CONF_SLEEP_WHEN_DONE in config:
        add(uln.set_sleep_when_done(config[CONF_SLEEP_WHEN_DONE]))

    if CONF_STEP_MODE in config:
        add(uln.set_step_mode(STEP_MODES[config[CONF_STEP_MODE]]))

    stepper.setup_stepper(uln, config)
    setup_component(uln, config)


BUILD_FLAGS = '-DUSE_ULN2003'
@@ -1,136 +0,0 @@
import logging
import re

import voluptuous as vol

from esphome import core
import esphome.config_validation as cv
from esphome.core import EsphomeError
from esphome.py_compat import string_types

_LOGGER = logging.getLogger(__name__)

CONF_SUBSTITUTIONS = 'substitutions'

VALID_SUBSTITUTIONS_CHARACTERS = 'abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ' \
                                 '0123456789_'


def validate_substitution_key(value):
    value = cv.string(value)
    if not value:
        raise vol.Invalid("Substitution key must not be empty")
    if value[0] == '$':
        value = value[1:]
        if not value:
            raise vol.Invalid("Substitution key must not be empty")
    if value[0].isdigit():
        raise vol.Invalid("First character in substitutions cannot be a digit.")
    for char in value:
        if char not in VALID_SUBSTITUTIONS_CHARACTERS:
            raise vol.Invalid(
                u"Substitution must only consist of upper/lowercase characters, the underscore "
                u"and numbers. The character '{}' cannot be used".format(char))
    return value


CONFIG_SCHEMA = cv.Schema({
    validate_substitution_key: cv.string_strict,
})


def to_code(config):
    pass


VARIABLE_PROG = re.compile('\\$([{0}]+|\\{{[{0}]*\\}})'.format(VALID_SUBSTITUTIONS_CHARACTERS))


def _expand_substitutions(substitutions, value, path):
    if u'$' not in value:
        return value

    orig_value = value

    i = 0
    while True:
        m = VARIABLE_PROG.search(value, i)
        if not m:
            # Nothing more to match. Done
            break

        i, j = m.span(0)
        name = m.group(1)
        if name.startswith(u'{') and name.endswith(u'}'):
            name = name[1:-1]
        if name not in substitutions:
            _LOGGER.warning(u"Found '%s' (see %s) which looks like a substitution, but '%s' was "
                            u"not declared", orig_value, u'->'.join(str(x) for x in path), name)
            i = j
            continue

        sub = substitutions[name]
        tail = value[j:]
        value = value[:i] + sub
        # Skip past the substituted text so it is not expanded again.
        i = len(value)
        value += tail
    return value


def _substitute_item(substitutions, item, path):
    if isinstance(item, list):
        for i, it in enumerate(item):
            sub = _substitute_item(substitutions, it, path + [i])
            if sub is not None:
                item[i] = sub
    elif isinstance(item, dict):
        replace_keys = []
        for k, v in item.items():
            if path or k != CONF_SUBSTITUTIONS:
                sub = _substitute_item(substitutions, k, path + [k])
                if sub is not None:
                    replace_keys.append((k, sub))
            sub = _substitute_item(substitutions, v, path + [k])
            if sub is not None:
                item[k] = sub
        for old, new in replace_keys:
            item[new] = item[old]
            del item[old]
    elif isinstance(item, string_types):
        sub = _expand_substitutions(substitutions, item, path)
        if sub != item:
            return sub
    elif isinstance(item, core.Lambda):
        sub = _expand_substitutions(substitutions, item.value, path)
        if sub != item:
            item.value = sub
    return None


def do_substitution_pass(config):
    if CONF_SUBSTITUTIONS not in config:
        return config

    substitutions = config[CONF_SUBSTITUTIONS]
    if not isinstance(substitutions, dict):
        raise EsphomeError(u"Substitutions must be a key to value mapping, got {}"
                           u"".format(type(substitutions)))

    key = ''
    try:
        replace_keys = []
        for key, value in substitutions.items():
            sub = validate_substitution_key(key)
            if sub != key:
                replace_keys.append((key, sub))
            substitutions[key] = cv.string_strict(value)
        for old, new in replace_keys:
            substitutions[new] = substitutions[old]
            del substitutions[old]
    except vol.Invalid as err:
        err.path.append(key)

        raise EsphomeError(u"Error while parsing substitutions: {}".format(err))

    config[CONF_SUBSTITUTIONS] = substitutions
    _substitute_item(substitutions, config, [])

    return config
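A self-contained sketch of the `$var` / `${var}` expansion implemented by `_expand_substitutions` above, reusing the same regular expression. It is simplified (no warning for undeclared keys, `re.sub` instead of the manual scan loop) but preserves the key property that substituted text is not rescanned:

```python
import re

VALID = 'abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789_'
VARIABLE_PROG = re.compile('\\$([{0}]+|\\{{[{0}]*\\}})'.format(VALID))


def expand(substitutions, value):
    def repl(m):
        name = m.group(1)
        if name.startswith('{') and name.endswith('}'):
            name = name[1:-1]
        # Unknown names are left untouched instead of logging a warning.
        return substitutions.get(name, m.group(0))
    return VARIABLE_PROG.sub(repl, value)


print(expand({'devicename': 'livingroom'}, 'sensor.${devicename}_temp'))
# -> sensor.livingroom_temp
```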
@@ -1,162 +0,0 @@
import voluptuous as vol

from esphome import automation
from esphome.automation import ACTION_REGISTRY, CONDITION_REGISTRY, Condition, maybe_simple_id
from esphome.components import mqtt
from esphome.components.mqtt import setup_mqtt_component
import esphome.config_validation as cv
from esphome.const import CONF_ICON, CONF_ID, CONF_INTERNAL, CONF_INVERTED, CONF_MQTT_ID, \
    CONF_ON_TURN_OFF, CONF_ON_TURN_ON, CONF_OPTIMISTIC, CONF_TRIGGER_ID
from esphome.core import CORE
from esphome.cpp_generator import Pvariable, add, get_variable
from esphome.cpp_types import Action, App, Nameable, Trigger, esphome_ns

PLATFORM_SCHEMA = cv.PLATFORM_SCHEMA.extend({

})

switch_ns = esphome_ns.namespace('switch_')
Switch = switch_ns.class_('Switch', Nameable)
SwitchPtr = Switch.operator('ptr')
MQTTSwitchComponent = switch_ns.class_('MQTTSwitchComponent', mqtt.MQTTComponent)

ToggleAction = switch_ns.class_('ToggleAction', Action)
TurnOffAction = switch_ns.class_('TurnOffAction', Action)
TurnOnAction = switch_ns.class_('TurnOnAction', Action)

SwitchCondition = switch_ns.class_('SwitchCondition', Condition)
SwitchTurnOnTrigger = switch_ns.class_('SwitchTurnOnTrigger', Trigger.template())
SwitchTurnOffTrigger = switch_ns.class_('SwitchTurnOffTrigger', Trigger.template())

SWITCH_SCHEMA = cv.MQTT_COMMAND_COMPONENT_SCHEMA.extend({
    cv.GenerateID(CONF_MQTT_ID): cv.declare_variable_id(MQTTSwitchComponent),
    vol.Optional(CONF_ICON): cv.icon,
    vol.Optional(CONF_INVERTED): cv.boolean,
    vol.Optional(CONF_ON_TURN_ON): automation.validate_automation({
        cv.GenerateID(CONF_TRIGGER_ID): cv.declare_variable_id(SwitchTurnOnTrigger),
    }),
    vol.Optional(CONF_ON_TURN_OFF): automation.validate_automation({
        cv.GenerateID(CONF_TRIGGER_ID): cv.declare_variable_id(SwitchTurnOffTrigger),
    }),
})

SWITCH_PLATFORM_SCHEMA = PLATFORM_SCHEMA.extend(SWITCH_SCHEMA.schema)


def setup_switch_core_(switch_var, config):
    if CONF_INTERNAL in config:
        add(switch_var.set_internal(config[CONF_INTERNAL]))
    if CONF_ICON in config:
        add(switch_var.set_icon(config[CONF_ICON]))
    if CONF_INVERTED in config:
        add(switch_var.set_inverted(config[CONF_INVERTED]))
    for conf in config.get(CONF_ON_TURN_ON, []):
        rhs = switch_var.make_switch_turn_on_trigger()
        trigger = Pvariable(conf[CONF_TRIGGER_ID], rhs)
        automation.build_automations(trigger, [], conf)
    for conf in config.get(CONF_ON_TURN_OFF, []):
        rhs = switch_var.make_switch_turn_off_trigger()
        trigger = Pvariable(conf[CONF_TRIGGER_ID], rhs)
        automation.build_automations(trigger, [], conf)

    setup_mqtt_component(switch_var.Pget_mqtt(), config)


def setup_switch(switch_obj, config):
    if not CORE.has_id(config[CONF_ID]):
        switch_obj = Pvariable(config[CONF_ID], switch_obj, has_side_effects=True)
    CORE.add_job(setup_switch_core_, switch_obj, config)


def register_switch(var, config):
    switch_var = Pvariable(config[CONF_ID], var, has_side_effects=True)
    add(App.register_switch(switch_var))
    CORE.add_job(setup_switch_core_, switch_var, config)


BUILD_FLAGS = '-DUSE_SWITCH'

CONF_SWITCH_TOGGLE = 'switch.toggle'
SWITCH_TOGGLE_ACTION_SCHEMA = maybe_simple_id({
    vol.Required(CONF_ID): cv.use_variable_id(Switch),
})


@ACTION_REGISTRY.register(CONF_SWITCH_TOGGLE, SWITCH_TOGGLE_ACTION_SCHEMA)
def switch_toggle_to_code(config, action_id, template_arg, args):
    for var in get_variable(config[CONF_ID]):
        yield None
    rhs = var.make_toggle_action(template_arg)
    type = ToggleAction.template(template_arg)
    yield Pvariable(action_id, rhs, type=type)


CONF_SWITCH_TURN_OFF = 'switch.turn_off'
SWITCH_TURN_OFF_ACTION_SCHEMA = maybe_simple_id({
    vol.Required(CONF_ID): cv.use_variable_id(Switch),
})


@ACTION_REGISTRY.register(CONF_SWITCH_TURN_OFF, SWITCH_TURN_OFF_ACTION_SCHEMA)
def switch_turn_off_to_code(config, action_id, template_arg, args):
    for var in get_variable(config[CONF_ID]):
        yield None
    rhs = var.make_turn_off_action(template_arg)
    type = TurnOffAction.template(template_arg)
    yield Pvariable(action_id, rhs, type=type)


CONF_SWITCH_TURN_ON = 'switch.turn_on'
SWITCH_TURN_ON_ACTION_SCHEMA = maybe_simple_id({
    vol.Required(CONF_ID): cv.use_variable_id(Switch),
})


@ACTION_REGISTRY.register(CONF_SWITCH_TURN_ON, SWITCH_TURN_ON_ACTION_SCHEMA)
def switch_turn_on_to_code(config, action_id, template_arg, args):
    for var in get_variable(config[CONF_ID]):
        yield None
    rhs = var.make_turn_on_action(template_arg)
    type = TurnOnAction.template(template_arg)
    yield Pvariable(action_id, rhs, type=type)


CONF_SWITCH_IS_ON = 'switch.is_on'
SWITCH_IS_ON_CONDITION_SCHEMA = maybe_simple_id({
    vol.Required(CONF_ID): cv.use_variable_id(Switch),
})


@CONDITION_REGISTRY.register(CONF_SWITCH_IS_ON, SWITCH_IS_ON_CONDITION_SCHEMA)
def switch_is_on_to_code(config, condition_id, template_arg, args):
    for var in get_variable(config[CONF_ID]):
        yield None
    rhs = var.make_switch_is_on_condition(template_arg)
    type = SwitchCondition.template(template_arg)
    yield Pvariable(condition_id, rhs, type=type)


CONF_SWITCH_IS_OFF = 'switch.is_off'
SWITCH_IS_OFF_CONDITION_SCHEMA = maybe_simple_id({
    vol.Required(CONF_ID): cv.use_variable_id(Switch),
})


@CONDITION_REGISTRY.register(CONF_SWITCH_IS_OFF, SWITCH_IS_OFF_CONDITION_SCHEMA)
def switch_is_off_to_code(config, condition_id, template_arg, args):
    for var in get_variable(config[CONF_ID]):
        yield None
    rhs = var.make_switch_is_off_condition(template_arg)
    type = SwitchCondition.template(template_arg)
    yield Pvariable(condition_id, rhs, type=type)


def core_to_hass_config(data, config):
    ret = mqtt.build_hass_config(data, 'switch', config, include_state=True, include_command=True)
    if ret is None:
        return None
    if CONF_ICON in config:
        ret['icon'] = config[CONF_ICON]
    if CONF_OPTIMISTIC in config:
        ret['optimistic'] = config[CONF_OPTIMISTIC]
    return ret
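All of the `switch.toggle` / `switch.turn_on` / `switch.turn_off` / `switch.is_on` / `switch.is_off` handlers above follow one pattern: a decorator registers a (name, schema, generator) entry in a global registry that the automation engine later looks up. A minimal sketch of that registry mechanism (invented names; the real registries also carry schemas through validation):

```python
ACTION_REGISTRY = {}


def register(name, schema):
    # Decorator factory: records the schema and codegen function under name.
    def decorator(func):
        ACTION_REGISTRY[name] = (schema, func)
        return func
    return decorator


@register('switch.toggle', {'id': str})
def switch_toggle_to_code(config):
    yield 'ToggleAction({})'.format(config['id'])


print(list(ACTION_REGISTRY))  # -> ['switch.toggle']
```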
@@ -1,38 +0,0 @@
import voluptuous as vol

from esphome.components import switch
import esphome.config_validation as cv
from esphome.const import CONF_ID, CONF_LAMBDA, CONF_NAME, CONF_SWITCHES
from esphome.cpp_generator import add, process_lambda, variable
from esphome.cpp_types import std_vector

CustomSwitchConstructor = switch.switch_ns.class_('CustomSwitchConstructor')

PLATFORM_SCHEMA = switch.PLATFORM_SCHEMA.extend({
    cv.GenerateID(): cv.declare_variable_id(CustomSwitchConstructor),
    vol.Required(CONF_LAMBDA): cv.lambda_,
    vol.Required(CONF_SWITCHES):
        cv.ensure_list(switch.SWITCH_SCHEMA.extend({
            cv.GenerateID(): cv.declare_variable_id(switch.Switch),
        })),
})


def to_code(config):
    for template_ in process_lambda(config[CONF_LAMBDA], [],
                                    return_type=std_vector.template(switch.SwitchPtr)):
        yield

    rhs = CustomSwitchConstructor(template_)
    custom = variable(config[CONF_ID], rhs)
    for i, conf in enumerate(config[CONF_SWITCHES]):
        rhs = custom.Pget_switch(i)
        add(rhs.set_name(conf[CONF_NAME]))
        switch.register_switch(rhs, conf)


BUILD_FLAGS = '-DUSE_CUSTOM_SWITCH'


def to_hass_config(data, config):
    return [switch.core_to_hass_config(data, swi) for swi in config[CONF_SWITCHES]]
@@ -1,54 +0,0 @@
import voluptuous as vol

from esphome import pins
from esphome.components import switch
import esphome.config_validation as cv
from esphome.const import CONF_ID, CONF_INTERLOCK, CONF_NAME, CONF_PIN, CONF_RESTORE_MODE
from esphome.cpp_generator import Pvariable, add, get_variable
from esphome.cpp_helpers import gpio_output_pin_expression, setup_component
from esphome.cpp_types import App, Component

GPIOSwitch = switch.switch_ns.class_('GPIOSwitch', switch.Switch, Component)
GPIOSwitchRestoreMode = switch.switch_ns.enum('GPIOSwitchRestoreMode')

RESTORE_MODES = {
    'RESTORE_DEFAULT_OFF': GPIOSwitchRestoreMode.GPIO_SWITCH_RESTORE_DEFAULT_OFF,
    'RESTORE_DEFAULT_ON': GPIOSwitchRestoreMode.GPIO_SWITCH_RESTORE_DEFAULT_ON,
    'ALWAYS_OFF': GPIOSwitchRestoreMode.GPIO_SWITCH_ALWAYS_OFF,
    'ALWAYS_ON': GPIOSwitchRestoreMode.GPIO_SWITCH_ALWAYS_ON,
}

PLATFORM_SCHEMA = cv.nameable(switch.SWITCH_PLATFORM_SCHEMA.extend({
    cv.GenerateID(): cv.declare_variable_id(GPIOSwitch),
    vol.Required(CONF_PIN): pins.gpio_output_pin_schema,
    vol.Optional(CONF_RESTORE_MODE): cv.one_of(*RESTORE_MODES, upper=True, space='_'),
    vol.Optional(CONF_INTERLOCK): cv.ensure_list(cv.use_variable_id(switch.Switch)),
}).extend(cv.COMPONENT_SCHEMA.schema))


def to_code(config):
    for pin in gpio_output_pin_expression(config[CONF_PIN]):
        yield
    rhs = App.make_gpio_switch(config[CONF_NAME], pin)
    gpio = Pvariable(config[CONF_ID], rhs)

    if CONF_RESTORE_MODE in config:
        add(gpio.set_restore_mode(RESTORE_MODES[config[CONF_RESTORE_MODE]]))

    if CONF_INTERLOCK in config:
        interlock = []
        for it in config[CONF_INTERLOCK]:
            for lock in get_variable(it):
                yield
            interlock.append(lock)
        add(gpio.set_interlock(interlock))

    switch.setup_switch(gpio, config)
    setup_component(gpio, config)


BUILD_FLAGS = '-DUSE_GPIO_SWITCH'


def to_hass_config(data, config):
    return switch.core_to_hass_config(data, config)
@@ -1,78 +0,0 @@
import voluptuous as vol

from esphome import automation
from esphome.automation import ACTION_REGISTRY
from esphome.components import switch
import esphome.config_validation as cv
from esphome.const import CONF_ASSUMED_STATE, CONF_ID, CONF_LAMBDA, CONF_NAME, CONF_OPTIMISTIC, \
    CONF_RESTORE_STATE, CONF_STATE, CONF_TURN_OFF_ACTION, CONF_TURN_ON_ACTION
from esphome.cpp_generator import Pvariable, add, get_variable, process_lambda, templatable
from esphome.cpp_helpers import setup_component
from esphome.cpp_types import Action, App, Component, bool_, optional

TemplateSwitch = switch.switch_ns.class_('TemplateSwitch', switch.Switch, Component)
SwitchPublishAction = switch.switch_ns.class_('SwitchPublishAction', Action)

PLATFORM_SCHEMA = cv.nameable(switch.SWITCH_PLATFORM_SCHEMA.extend({
    cv.GenerateID(): cv.declare_variable_id(TemplateSwitch),
    vol.Optional(CONF_LAMBDA): cv.lambda_,
    vol.Optional(CONF_OPTIMISTIC): cv.boolean,
    vol.Optional(CONF_ASSUMED_STATE): cv.boolean,
    vol.Optional(CONF_TURN_OFF_ACTION): automation.validate_automation(single=True),
    vol.Optional(CONF_TURN_ON_ACTION): automation.validate_automation(single=True),
    vol.Optional(CONF_RESTORE_STATE): cv.boolean,
}).extend(cv.COMPONENT_SCHEMA.schema))


def to_code(config):
    rhs = App.make_template_switch(config[CONF_NAME])
    template = Pvariable(config[CONF_ID], rhs)

    switch.setup_switch(template, config)

    if CONF_LAMBDA in config:
        for template_ in process_lambda(config[CONF_LAMBDA], [],
                                        return_type=optional.template(bool_)):
            yield
        add(template.set_state_lambda(template_))
    if CONF_TURN_OFF_ACTION in config:
        automation.build_automations(template.get_turn_off_trigger(), [],
                                     config[CONF_TURN_OFF_ACTION])
    if CONF_TURN_ON_ACTION in config:
        automation.build_automations(template.get_turn_on_trigger(), [],
                                     config[CONF_TURN_ON_ACTION])
    if CONF_OPTIMISTIC in config:
        add(template.set_optimistic(config[CONF_OPTIMISTIC]))
    if CONF_ASSUMED_STATE in config:
        add(template.set_assumed_state(config[CONF_ASSUMED_STATE]))

    if CONF_RESTORE_STATE in config:
        add(template.set_restore_state(config[CONF_RESTORE_STATE]))

    setup_component(template, config)


BUILD_FLAGS = '-DUSE_TEMPLATE_SWITCH'

CONF_SWITCH_TEMPLATE_PUBLISH = 'switch.template.publish'
SWITCH_TEMPLATE_PUBLISH_ACTION_SCHEMA = cv.Schema({
    vol.Required(CONF_ID): cv.use_variable_id(switch.Switch),
    vol.Required(CONF_STATE): cv.templatable(cv.boolean),
})


@ACTION_REGISTRY.register(CONF_SWITCH_TEMPLATE_PUBLISH, SWITCH_TEMPLATE_PUBLISH_ACTION_SCHEMA)
def switch_template_publish_to_code(config, action_id, template_arg, args):
    for var in get_variable(config[CONF_ID]):
        yield None
    rhs = var.make_switch_publish_action(template_arg)
    type = SwitchPublishAction.template(template_arg)
    action = Pvariable(action_id, rhs, type=type)
    for template_ in templatable(config[CONF_STATE], args, bool_):
        yield None
    add(action.set_state(template_))
    yield action


def to_hass_config(data, config):
    return switch.core_to_hass_config(data, config)
@@ -1,38 +0,0 @@
import voluptuous as vol

from esphome.components import text_sensor
import esphome.config_validation as cv
from esphome.const import CONF_ID, CONF_LAMBDA, CONF_NAME, CONF_TEXT_SENSORS
from esphome.cpp_generator import add, process_lambda, variable
from esphome.cpp_types import std_vector

CustomTextSensorConstructor = text_sensor.text_sensor_ns.class_('CustomTextSensorConstructor')

PLATFORM_SCHEMA = text_sensor.PLATFORM_SCHEMA.extend({
    cv.GenerateID(): cv.declare_variable_id(CustomTextSensorConstructor),
    vol.Required(CONF_LAMBDA): cv.lambda_,
    vol.Required(CONF_TEXT_SENSORS):
        cv.ensure_list(text_sensor.TEXT_SENSOR_SCHEMA.extend({
            cv.GenerateID(): cv.declare_variable_id(text_sensor.TextSensor),
        })),
})


def to_code(config):
    for template_ in process_lambda(config[CONF_LAMBDA], [],
                                    return_type=std_vector.template(text_sensor.TextSensorPtr)):
        yield

    rhs = CustomTextSensorConstructor(template_)
    custom = variable(config[CONF_ID], rhs)
    for i, conf in enumerate(config[CONF_TEXT_SENSORS]):
        rhs = custom.Pget_text_sensor(i)
        add(rhs.set_name(conf[CONF_NAME]))
        text_sensor.register_text_sensor(rhs, conf)


BUILD_FLAGS = '-DUSE_CUSTOM_TEXT_SENSOR'


def to_hass_config(data, config):
    return [text_sensor.core_to_hass_config(data, sens) for sens in config[CONF_TEXT_SENSORS]]
@@ -1,30 +0,0 @@
import voluptuous as vol

from esphome.components import text_sensor
import esphome.config_validation as cv
from esphome.const import CONF_ENTITY_ID, CONF_ID, CONF_NAME
from esphome.cpp_generator import Pvariable
from esphome.cpp_types import App, Component

DEPENDENCIES = ['api']

HomeassistantTextSensor = text_sensor.text_sensor_ns.class_('HomeassistantTextSensor',
                                                            text_sensor.TextSensor, Component)

PLATFORM_SCHEMA = cv.nameable(text_sensor.TEXT_SENSOR_PLATFORM_SCHEMA.extend({
    cv.GenerateID(): cv.declare_variable_id(HomeassistantTextSensor),
    vol.Required(CONF_ENTITY_ID): cv.entity_id,
}))


def to_code(config):
    rhs = App.make_homeassistant_text_sensor(config[CONF_NAME], config[CONF_ENTITY_ID])
    sensor_ = Pvariable(config[CONF_ID], rhs)
    text_sensor.setup_text_sensor(sensor_, config)


BUILD_FLAGS = '-DUSE_HOMEASSISTANT_TEXT_SENSOR'


def to_hass_config(data, config):
    return text_sensor.core_to_hass_config(data, config)
@@ -1,59 +0,0 @@
import voluptuous as vol

from esphome.automation import ACTION_REGISTRY
from esphome.components import text_sensor
from esphome.components.text_sensor import TextSensorPublishAction
import esphome.config_validation as cv
from esphome.const import CONF_ID, CONF_LAMBDA, CONF_NAME, CONF_STATE, CONF_UPDATE_INTERVAL
from esphome.cpp_generator import Pvariable, add, get_variable, process_lambda, templatable
from esphome.cpp_helpers import setup_component
from esphome.cpp_types import App, PollingComponent, optional, std_string

TemplateTextSensor = text_sensor.text_sensor_ns.class_('TemplateTextSensor',
                                                       text_sensor.TextSensor, PollingComponent)

PLATFORM_SCHEMA = cv.nameable(text_sensor.TEXT_SENSOR_PLATFORM_SCHEMA.extend({
    cv.GenerateID(): cv.declare_variable_id(TemplateTextSensor),
    vol.Optional(CONF_LAMBDA): cv.lambda_,
    vol.Optional(CONF_UPDATE_INTERVAL): cv.update_interval,
}).extend(cv.COMPONENT_SCHEMA.schema))


def to_code(config):
    rhs = App.make_template_text_sensor(config[CONF_NAME], config.get(CONF_UPDATE_INTERVAL))
    template = Pvariable(config[CONF_ID], rhs)
    text_sensor.setup_text_sensor(template, config)
    setup_component(template, config)

    if CONF_LAMBDA in config:
        for template_ in process_lambda(config[CONF_LAMBDA], [],
                                        return_type=optional.template(std_string)):
            yield
        add(template.set_template(template_))


BUILD_FLAGS = '-DUSE_TEMPLATE_TEXT_SENSOR'

CONF_TEXT_SENSOR_TEMPLATE_PUBLISH = 'text_sensor.template.publish'
TEXT_SENSOR_TEMPLATE_PUBLISH_ACTION_SCHEMA = cv.Schema({
    vol.Required(CONF_ID): cv.use_variable_id(text_sensor.TextSensor),
    vol.Required(CONF_STATE): cv.templatable(cv.string_strict),
})


@ACTION_REGISTRY.register(CONF_TEXT_SENSOR_TEMPLATE_PUBLISH,
                          TEXT_SENSOR_TEMPLATE_PUBLISH_ACTION_SCHEMA)
def text_sensor_template_publish_to_code(config, action_id, template_arg, args):
    for var in get_variable(config[CONF_ID]):
        yield None
    rhs = var.make_text_sensor_publish_action(template_arg)
    type = TextSensorPublishAction.template(template_arg)
    action = Pvariable(action_id, rhs, type=type)
    for template_ in templatable(config[CONF_STATE], args, std_string):
        yield None
    add(action.set_state(template_))
    yield action


def to_hass_config(data, config):
    return text_sensor.core_to_hass_config(data, config)
@@ -1,45 +0,0 @@
import voluptuous as vol

from esphome.components import text_sensor
import esphome.config_validation as cv
from esphome.const import CONF_BSSID, CONF_ID, CONF_IP_ADDRESS, CONF_NAME, CONF_SSID
from esphome.cpp_generator import Pvariable
from esphome.cpp_types import App, Component

DEPENDENCIES = ['wifi']

IPAddressWiFiInfo = text_sensor.text_sensor_ns.class_('IPAddressWiFiInfo',
                                                      text_sensor.TextSensor, Component)
SSIDWiFiInfo = text_sensor.text_sensor_ns.class_('SSIDWiFiInfo',
                                                 text_sensor.TextSensor, Component)
BSSIDWiFiInfo = text_sensor.text_sensor_ns.class_('BSSIDWiFiInfo',
                                                  text_sensor.TextSensor, Component)

PLATFORM_SCHEMA = text_sensor.PLATFORM_SCHEMA.extend({
    vol.Optional(CONF_IP_ADDRESS): cv.nameable(text_sensor.TEXT_SENSOR_SCHEMA.extend({
        cv.GenerateID(): cv.declare_variable_id(IPAddressWiFiInfo),
    })),
    vol.Optional(CONF_SSID): cv.nameable(text_sensor.TEXT_SENSOR_SCHEMA.extend({
        cv.GenerateID(): cv.declare_variable_id(SSIDWiFiInfo),
    })),
    vol.Optional(CONF_BSSID): cv.nameable(text_sensor.TEXT_SENSOR_SCHEMA.extend({
        cv.GenerateID(): cv.declare_variable_id(BSSIDWiFiInfo),
    })),
})


def setup_conf(config, key, klass):
    if key in config:
        conf = config[key]
        rhs = App.register_component(klass.new(conf[CONF_NAME]))
        sensor_ = Pvariable(conf[CONF_ID], rhs)
        text_sensor.register_text_sensor(sensor_, conf)


def to_code(config):
    setup_conf(config, CONF_IP_ADDRESS, IPAddressWiFiInfo)
    setup_conf(config, CONF_SSID, SSIDWiFiInfo)
    setup_conf(config, CONF_BSSID, BSSIDWiFiInfo)


BUILD_FLAGS = '-DUSE_WIFI_INFO_TEXT_SENSOR'
@@ -1,24 +0,0 @@
from esphome.components import time as time_
import esphome.config_validation as cv
from esphome.const import CONF_ID
from esphome.cpp_generator import Pvariable
from esphome.cpp_helpers import setup_component
from esphome.cpp_types import App

DEPENDENCIES = ['api']

HomeAssistantTime = time_.time_ns.class_('HomeAssistantTime', time_.RealTimeClockComponent)

PLATFORM_SCHEMA = time_.TIME_PLATFORM_SCHEMA.extend({
    cv.GenerateID(): cv.declare_variable_id(HomeAssistantTime),
}).extend(cv.COMPONENT_SCHEMA.schema)


def to_code(config):
    rhs = App.make_homeassistant_time_component()
    ha_time = Pvariable(config[CONF_ID], rhs)
    time_.setup_time(ha_time, config)
    setup_component(ha_time, config)


BUILD_FLAGS = '-DUSE_HOMEASSISTANT_TIME'
@@ -1,43 +0,0 @@
import voluptuous as vol

from esphome import pins
import esphome.config_validation as cv
from esphome.const import CONF_BAUD_RATE, CONF_ID, CONF_RX_PIN, CONF_TX_PIN
from esphome.core import CORE
from esphome.cpp_generator import Pvariable, add
from esphome.cpp_helpers import setup_component
from esphome.cpp_types import App, Component, esphome_ns

UARTComponent = esphome_ns.class_('UARTComponent', Component)
UARTDevice = esphome_ns.class_('UARTDevice')
MULTI_CONF = True


def validate_rx_pin(value):
    value = pins.input_pin(value)
    if CORE.is_esp8266 and value >= 16:
        raise vol.Invalid("Pins GPIO16 and GPIO17 cannot be used as RX pins on ESP8266.")
    return value


CONFIG_SCHEMA = vol.All(cv.Schema({
    cv.GenerateID(): cv.declare_variable_id(UARTComponent),
    vol.Optional(CONF_TX_PIN): pins.output_pin,
    vol.Optional(CONF_RX_PIN): validate_rx_pin,
    vol.Required(CONF_BAUD_RATE): cv.positive_int,
}).extend(cv.COMPONENT_SCHEMA.schema), cv.has_at_least_one_key(CONF_TX_PIN, CONF_RX_PIN))


def to_code(config):
    rhs = App.init_uart(config[CONF_BAUD_RATE])
    var = Pvariable(config[CONF_ID], rhs)

    if CONF_TX_PIN in config:
        add(var.set_tx_pin(config[CONF_TX_PIN]))
    if CONF_RX_PIN in config:
        add(var.set_rx_pin(config[CONF_RX_PIN]))

    setup_component(var, config)


BUILD_FLAGS = '-DUSE_UART'
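`validate_rx_pin` rejects GPIO numbers of 16 and above as RX pins when the target is an ESP8266. A standalone check of the same rule, with the platform flag stubbed to a boolean and `pins.input_pin` reduced to `int()` (requires voluptuous):

```python
import voluptuous as vol

IS_ESP8266 = True  # stands in for CORE.is_esp8266


def validate_rx_pin(value):
    value = int(value)  # stands in for pins.input_pin(value)
    if IS_ESP8266 and value >= 16:
        raise vol.Invalid("Pins GPIO16 and GPIO17 cannot be used as RX pins "
                          "on ESP8266.")
    return value


print(validate_rx_pin(3))  # -> 3
# validate_rx_pin(16)      # raises vol.Invalid on ESP8266
```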
@@ -1,203 +0,0 @@
import voluptuous as vol

from esphome.automation import CONDITION_REGISTRY, Condition
import esphome.config_validation as cv
from esphome.const import CONF_AP, CONF_BSSID, CONF_CHANNEL, CONF_DNS1, CONF_DNS2, CONF_DOMAIN, \
    CONF_FAST_CONNECT, CONF_GATEWAY, CONF_HIDDEN, CONF_ID, CONF_MANUAL_IP, CONF_NETWORKS, \
    CONF_PASSWORD, CONF_POWER_SAVE_MODE, CONF_REBOOT_TIMEOUT, CONF_SSID, CONF_STATIC_IP, \
    CONF_SUBNET, CONF_USE_ADDRESS
from esphome.core import CORE, HexInt
from esphome.cpp_generator import Pvariable, StructInitializer, add, variable
from esphome.cpp_types import App, Component, esphome_ns, global_ns

IPAddress = global_ns.class_('IPAddress')
ManualIP = esphome_ns.struct('ManualIP')
WiFiComponent = esphome_ns.class_('WiFiComponent', Component)
WiFiAP = esphome_ns.struct('WiFiAP')

WiFiPowerSaveMode = esphome_ns.enum('WiFiPowerSaveMode')
WIFI_POWER_SAVE_MODES = {
    'NONE': WiFiPowerSaveMode.WIFI_POWER_SAVE_NONE,
    'LIGHT': WiFiPowerSaveMode.WIFI_POWER_SAVE_LIGHT,
    'HIGH': WiFiPowerSaveMode.WIFI_POWER_SAVE_HIGH,
}
WiFiConnectedCondition = esphome_ns.class_('WiFiConnectedCondition', Condition)


def validate_password(value):
    value = cv.string_strict(value)
    if not value:
        return value
    if len(value) < 8:
        raise vol.Invalid(u"WPA password must be at least 8 characters long")
    if len(value) > 64:
        raise vol.Invalid(u"WPA password must be at most 64 characters long")
    return value


def validate_channel(value):
    value = cv.positive_int(value)
    if value < 1:
        raise vol.Invalid("Minimum WiFi channel is 1")
    if value > 14:
        raise vol.Invalid("Maximum WiFi channel is 14")
    return value


AP_MANUAL_IP_SCHEMA = cv.Schema({
    vol.Required(CONF_STATIC_IP): cv.ipv4,
    vol.Required(CONF_GATEWAY): cv.ipv4,
    vol.Required(CONF_SUBNET): cv.ipv4,
})

STA_MANUAL_IP_SCHEMA = AP_MANUAL_IP_SCHEMA.extend({
    vol.Optional(CONF_DNS1, default="1.1.1.1"): cv.ipv4,
    vol.Optional(CONF_DNS2, default="1.0.0.1"): cv.ipv4,
})

WIFI_NETWORK_BASE = cv.Schema({
    cv.GenerateID(): cv.declare_variable_id(WiFiAP),
    vol.Optional(CONF_SSID): cv.ssid,
    vol.Optional(CONF_PASSWORD): validate_password,
    vol.Optional(CONF_CHANNEL): validate_channel,
    vol.Optional(CONF_MANUAL_IP): STA_MANUAL_IP_SCHEMA,
})

WIFI_NETWORK_AP = WIFI_NETWORK_BASE.extend({

})

WIFI_NETWORK_STA = WIFI_NETWORK_BASE.extend({
    vol.Optional(CONF_BSSID): cv.mac_address,
    vol.Optional(CONF_HIDDEN): cv.boolean,
})


def validate(config):
    if CONF_PASSWORD in config and CONF_SSID not in config:
        raise vol.Invalid("Cannot have WiFi password without SSID!")

    if CONF_SSID in config:
        network = {CONF_SSID: config.pop(CONF_SSID)}
        if CONF_PASSWORD in config:
            network[CONF_PASSWORD] = config.pop(CONF_PASSWORD)
        if CONF_NETWORKS in config:
            raise vol.Invalid("You cannot use the 'ssid:' option together with 'networks:'. Please "
                              "copy your network into the 'networks:' key")
        config[CONF_NETWORKS] = cv.ensure_list(WIFI_NETWORK_STA)(network)

    if (CONF_NETWORKS not in config) and (CONF_AP not in config):
        raise vol.Invalid("Please specify at least an SSID or an Access Point "
                          "to create.")

    if config.get(CONF_FAST_CONNECT, False):
        networks = config.get(CONF_NETWORKS, [])
        if not networks:
            raise vol.Invalid("At least one network required for fast_connect!")
        if len(networks) != 1:
            raise vol.Invalid("Fast connect can only be used with one network!")

    if CONF_USE_ADDRESS not in config:
        if CONF_MANUAL_IP in config:
            use_address = str(config[CONF_MANUAL_IP][CONF_STATIC_IP])
        else:
            use_address = CORE.name + config[CONF_DOMAIN]
        config[CONF_USE_ADDRESS] = use_address

    return config


CONFIG_SCHEMA = vol.All(cv.Schema({
    cv.GenerateID(): cv.declare_variable_id(WiFiComponent),
    vol.Optional(CONF_NETWORKS): cv.ensure_list(WIFI_NETWORK_STA),

    vol.Optional(CONF_SSID): cv.ssid,
    vol.Optional(CONF_PASSWORD): validate_password,
    vol.Optional(CONF_MANUAL_IP): STA_MANUAL_IP_SCHEMA,

    vol.Optional(CONF_AP): WIFI_NETWORK_AP,
    vol.Optional(CONF_DOMAIN, default='.local'): cv.domain_name,
    vol.Optional(CONF_REBOOT_TIMEOUT): cv.positive_time_period_milliseconds,
    vol.Optional(CONF_POWER_SAVE_MODE): cv.one_of(*WIFI_POWER_SAVE_MODES, upper=True),
    vol.Optional(CONF_FAST_CONNECT): cv.boolean,
    vol.Optional(CONF_USE_ADDRESS): cv.string_strict,

    vol.Optional('hostname'): cv.invalid("The hostname option has been removed in 1.11.0"),
}), validate)


def safe_ip(ip):
    if ip is None:
        return IPAddress(0, 0, 0, 0)
    return IPAddress(*ip.args)


def manual_ip(config):
    if config is None:
        return None
    return StructInitializer(
        ManualIP,
        ('static_ip', safe_ip(config[CONF_STATIC_IP])),
        ('gateway', safe_ip(config[CONF_GATEWAY])),
        ('subnet', safe_ip(config[CONF_SUBNET])),
        ('dns1', safe_ip(config.get(CONF_DNS1))),
        ('dns2', safe_ip(config.get(CONF_DNS2))),
    )


def wifi_network(config, static_ip):
    ap = variable(config[CONF_ID], WiFiAP())
    if CONF_SSID in config:
        add(ap.set_ssid(config[CONF_SSID]))
    if CONF_PASSWORD in config:
        add(ap.set_password(config[CONF_PASSWORD]))
    if CONF_BSSID in config:
        add(ap.set_bssid([HexInt(i) for i in config[CONF_BSSID].parts]))
    if CONF_HIDDEN in config:
        add(ap.set_hidden(config[CONF_HIDDEN]))
    if CONF_CHANNEL in config:
        add(ap.set_channel(config[CONF_CHANNEL]))
    if static_ip is not None:
        add(ap.set_manual_ip(manual_ip(static_ip)))

    return ap


def to_code(config):
    rhs = App.init_wifi()
    wifi = Pvariable(config[CONF_ID], rhs)
    add(wifi.set_use_address(config[CONF_USE_ADDRESS]))

    for network in config.get(CONF_NETWORKS, []):
        add(wifi.add_sta(wifi_network(network, config.get(CONF_MANUAL_IP))))

    if CONF_AP in config:
        add(wifi.set_ap(wifi_network(config[CONF_AP], config.get(CONF_MANUAL_IP))))

    if CONF_REBOOT_TIMEOUT in config:
        add(wifi.set_reboot_timeout(config[CONF_REBOOT_TIMEOUT]))

    if CONF_POWER_SAVE_MODE in config:
        add(wifi.set_power_save_mode(WIFI_POWER_SAVE_MODES[config[CONF_POWER_SAVE_MODE]]))

    if CONF_FAST_CONNECT in config:
        add(wifi.set_fast_connect(config[CONF_FAST_CONNECT]))


def lib_deps(config):
    if CORE.is_esp8266:
        return 'ESP8266WiFi'
    if CORE.is_esp32:
        return None
    raise NotImplementedError


CONF_WIFI_CONNECTED = 'wifi.connected'
WIFI_CONNECTED_CONDITION_SCHEMA = cv.Schema({})


@CONDITION_REGISTRY.register(CONF_WIFI_CONNECTED, WIFI_CONNECTED_CONDITION_SCHEMA)
def wifi_connected_to_code(config, condition_id, template_arg, args):
    rhs = WiFiConnectedCondition.new(template_arg)
    type = WiFiConnectedCondition.template(template_arg)
    yield Pvariable(condition_id, rhs, type=type)
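For reference, a standalone sketch of the normalization `validate()` performs above: a bare `ssid:`/`password:` pair is folded into the `networks:` list (plain string keys stand in for the `CONF_*` constants, and schema validation is omitted):

```python
# Sketch: fold a top-level ssid/password into networks, as validate() does.
def normalize(config):
    if 'password' in config and 'ssid' not in config:
        raise ValueError("Cannot have WiFi password without SSID!")
    if 'ssid' in config:
        network = {'ssid': config.pop('ssid')}
        if 'password' in config:
            network['password'] = config.pop('password')
        config['networks'] = [network]
    return config

print(normalize({'ssid': 'MyAP', 'password': 'hunter2pass'}))
# -> {'networks': [{'ssid': 'MyAP', 'password': 'hunter2pass'}]}
```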
@@ -1,613 +0,0 @@
from __future__ import print_function

from collections import OrderedDict
import importlib
import logging
import re

import voluptuous as vol

from esphome import core, core_config, yaml_util
from esphome.components import substitutions
from esphome.const import CONF_ESPHOME, CONF_PLATFORM, ESP_PLATFORMS
from esphome.core import CORE, EsphomeError
from esphome.helpers import color, indent
from esphome.py_compat import text_type
from esphome.util import safe_print

# pylint: disable=unused-import, wrong-import-order
from typing import List, Optional, Tuple, Union  # noqa
from esphome.core import ConfigType  # noqa
from esphome.yaml_util import is_secret
from esphome.voluptuous_schema import ExtraKeysInvalid

_LOGGER = logging.getLogger(__name__)

_COMPONENT_CACHE = {}


def get_component(domain):
    if domain in _COMPONENT_CACHE:
        return _COMPONENT_CACHE[domain]

    path = 'esphome.components.{}'.format(domain)
    try:
        module = importlib.import_module(path)
    except (ImportError, ValueError) as err:
        _LOGGER.debug(err)
    else:
        _COMPONENT_CACHE[domain] = module
        return module

    _LOGGER.error("Unable to find component %s", domain)
    return None


def get_platform(domain, platform):
    return get_component("{}.{}".format(domain, platform))


def is_platform_component(component):
    return hasattr(component, 'PLATFORM_SCHEMA')


def iter_components(config):
    for domain, conf in config.items():
        if domain == CONF_ESPHOME:
            yield CONF_ESPHOME, core_config, conf
            continue
        component = get_component(domain)
        if getattr(component, 'MULTI_CONF', False):
            for conf_ in conf:
                yield domain, component, conf_
        else:
            yield domain, component, conf
        if is_platform_component(component):
            for p_config in conf:
                p_name = u"{}.{}".format(domain, p_config[CONF_PLATFORM])
                platform = get_component(p_name)
                yield p_name, platform, p_config


ConfigPath = List[Union[str, int]]


def _path_begins_with_(path, other):  # type: (ConfigPath, ConfigPath) -> bool
    if len(path) < len(other):
        return False
    return path[:len(other)] == other


def _path_begins_with(path, other):  # type: (ConfigPath, ConfigPath) -> bool
    ret = _path_begins_with_(path, other)
    return ret


class Config(OrderedDict):
    def __init__(self):
        super(Config, self).__init__()
        self.errors = []  # type: List[Tuple[basestring, ConfigPath]]
        self.domains = []  # type: List[Tuple[ConfigPath, basestring]]

    def add_error(self, message, path):
        # type: (basestring, ConfigPath) -> None
        if not isinstance(message, text_type):
            message = text_type(message)
        self.errors.append((message, path))

    def add_domain(self, path, name):
        # type: (ConfigPath, basestring) -> None
        self.domains.append((path, name))

    def remove_domain(self, path, name):
        self.domains.remove((path, name))

    def lookup_domain(self, path):
        # type: (ConfigPath) -> Optional[basestring]
        best_len = 0
        best_domain = None
        for d_path, domain in self.domains:
            if len(d_path) < best_len:
                continue
            if _path_begins_with(path, d_path):
                best_len = len(d_path)
                best_domain = domain
        return best_domain

    def is_in_error_path(self, path):
        for _, p in self.errors:
            if _path_begins_with(p, path):
                return True
        return False

    def get_error_for_path(self, path):
        for msg, p in self.errors:
            if self.nested_item_path(p) == path:
                return msg
        return None

    def nested_item(self, path):
        data = self
        for item_index in path:
            try:
                data = data[item_index]
            except (KeyError, IndexError, TypeError):
                return {}
        return data

    def nested_item_path(self, path):
        data = self
        part = []
        for item_index in path:
            try:
                data = data[item_index]
            except (KeyError, IndexError, TypeError):
                return part
            part.append(item_index)
        return part


def iter_ids(config, path=None):
    path = path or []
    if isinstance(config, core.ID):
        yield config, path
    elif isinstance(config, core.Lambda):
        for id in config.requires_ids:
            yield id, path
    elif isinstance(config, list):
        for i, item in enumerate(config):
            for result in iter_ids(item, path + [i]):
                yield result
    elif isinstance(config, dict):
        for key, value in config.items():
            for result in iter_ids(value, path + [key]):
                yield result


def do_id_pass(result):  # type: (Config) -> None
    from esphome.cpp_generator import MockObjClass

    declare_ids = []  # type: List[Tuple[core.ID, ConfigPath]]
    searching_ids = []  # type: List[Tuple[core.ID, ConfigPath]]
    for id, path in iter_ids(result):
        if id.is_declaration:
            if id.id is not None and any(v[0].id == id.id for v in declare_ids):
                result.add_error(u"ID {} redefined!".format(id.id), path)
                continue
            declare_ids.append((id, path))
        else:
            searching_ids.append((id, path))
    # Resolve default ids after manual IDs
    for id, _ in declare_ids:
        id.resolve([v[0].id for v in declare_ids])

    # Check searched IDs
    for id, path in searching_ids:
        if id.id is not None:
            # manually declared
            match = next((v[0] for v in declare_ids if v[0].id == id.id), None)
            if match is None:
                # No declared ID with this name
                result.add_error("Couldn't find ID '{}'".format(id.id), path)
                continue
            if not isinstance(match.type, MockObjClass) or not isinstance(id.type, MockObjClass):
                continue
            if not match.type.inherits_from(id.type):
                result.add_error("ID '{}' of type {} doesn't inherit from {}. Please double check "
                                 "your ID is pointing to the correct value"
                                 "".format(id.id, match.type, id.type), path)

        if id.id is None and id.type is not None:
            for v in declare_ids:
                if v[0] is None or not isinstance(v[0].type, MockObjClass):
                    continue
                inherits = v[0].type.inherits_from(id.type)
                if inherits:
                    id.id = v[0].id
                    break
            else:
                result.add_error("Couldn't resolve ID for type '{}'".format(id.type), path)


def validate_config(config):
    result = Config()

    def _comp_error(ex, path):
        # type: (vol.Invalid, List[basestring]) -> None
        if isinstance(ex, vol.MultipleInvalid):
            errors = ex.errors
        else:
            errors = [ex]

        for e in errors:
            path_ = path + e.path
            domain = result.lookup_domain(path_) or ''
            result.add_error(_format_vol_invalid(e, config, path, domain), path_)

    skip_paths = list()  # type: List[ConfigPath]

    # Step 1: Load everything
    result.add_domain([CONF_ESPHOME], CONF_ESPHOME)
    result[CONF_ESPHOME] = config[CONF_ESPHOME]

    for domain, conf in config.items():
        domain = str(domain)
        if domain == CONF_ESPHOME or domain.startswith(u'.'):
            skip_paths.append([domain])
            continue
        result.add_domain([domain], domain)
        result[domain] = conf
        if conf is None:
            result[domain] = conf = {}
        component = get_component(domain)
        if component is None:
            result.add_error(u"Component not found: {}".format(domain), [domain])
            skip_paths.append([domain])
            continue

        if not isinstance(conf, list) and getattr(component, 'MULTI_CONF', False):
            result[domain] = conf = [conf]

        success = True
        dependencies = getattr(component, 'DEPENDENCIES', [])
        for dependency in dependencies:
            if dependency not in config:
                result.add_error(u"Component {} requires component {}".format(domain, dependency),
                                 [domain])
                success = False
        if not success:
            skip_paths.append([domain])
            continue

        success = True
        conflicts_with = getattr(component, 'CONFLICTS_WITH', [])
        for conflict in conflicts_with:
            if conflict in config:
                result.add_error(u"Component {} cannot be used together with component {}"
                                 u"".format(domain, conflict), [domain])
                success = False
        if not success:
            skip_paths.append([domain])
            continue

        esp_platforms = getattr(component, 'ESP_PLATFORMS', ESP_PLATFORMS)
        if CORE.esp_platform not in esp_platforms:
            result.add_error(u"Component {} doesn't support {}.".format(domain, CORE.esp_platform),
                             [domain])
            skip_paths.append([domain])
            continue

        if not hasattr(component, 'PLATFORM_SCHEMA'):
            continue

        result.remove_domain([domain], domain)

        if not isinstance(conf, list) and conf:
            result[domain] = conf = [conf]

        for i, p_config in enumerate(conf):
            if not isinstance(p_config, dict):
                result.add_error(u"Platform schemas must have 'platform:' key", [domain, i])
                skip_paths.append([domain, i])
                continue
            p_name = p_config.get('platform')
            if p_name is None:
                result.add_error(u"No platform specified for {}".format(domain), [domain, i])
                skip_paths.append([domain, i])
                continue
            p_domain = u'{}.{}'.format(domain, p_name)
            result.add_domain([domain, i], p_domain)
            platform = get_platform(domain, p_name)
            if platform is None:
                result.add_error(u"Platform not found: '{}'".format(p_domain), [domain, i])
                skip_paths.append([domain, i])
                continue

            success = True
            dependencies = getattr(platform, 'DEPENDENCIES', [])
            for dependency in dependencies:
                if dependency not in config:
                    result.add_error(u"Platform {} requires component {}"
                                     u"".format(p_domain, dependency), [domain, i])
                    success = False
            if not success:
                skip_paths.append([domain, i])
                continue

            success = True
            conflicts_with = getattr(platform, 'CONFLICTS_WITH', [])
            for conflict in conflicts_with:
                if conflict in config:
                    result.add_error(u"Platform {} cannot be used together with component {}"
                                     u"".format(p_domain, conflict), [domain, i])
                    success = False
            if not success:
                skip_paths.append([domain, i])
                continue

            esp_platforms = getattr(platform, 'ESP_PLATFORMS', ESP_PLATFORMS)
            if CORE.esp_platform not in esp_platforms:
                result.add_error(u"Platform {} doesn't support {}."
                                 u"".format(p_domain, CORE.esp_platform), [domain, i])
                skip_paths.append([domain, i])
                continue

    # Step 2: Validate configuration
    try:
        result[CONF_ESPHOME] = core_config.CONFIG_SCHEMA(result[CONF_ESPHOME])
    except vol.Invalid as ex:
        _comp_error(ex, [CONF_ESPHOME])

    for domain, conf in result.items():
        domain = str(domain)
        if [domain] in skip_paths:
            continue
        component = get_component(domain)

        if hasattr(component, 'CONFIG_SCHEMA'):
            multi_conf = getattr(component, 'MULTI_CONF', False)

            if multi_conf:
                for i, conf_ in enumerate(conf):
                    try:
                        validated = component.CONFIG_SCHEMA(conf_)
                        result[domain][i] = validated
                    except vol.Invalid as ex:
                        _comp_error(ex, [domain, i])
            else:
                try:
                    validated = component.CONFIG_SCHEMA(conf)
                    result[domain] = validated
                except vol.Invalid as ex:
                    _comp_error(ex, [domain])
            continue

        if not hasattr(component, 'PLATFORM_SCHEMA'):
            continue

        for i, p_config in enumerate(conf):
            if [domain, i] in skip_paths:
                continue
            p_name = p_config['platform']
            platform = get_platform(domain, p_name)

            if hasattr(platform, 'PLATFORM_SCHEMA'):
                try:
                    p_validated = platform.PLATFORM_SCHEMA(p_config)
                except vol.Invalid as ex:
                    _comp_error(ex, [domain, i])
                    continue
                result[domain][i] = p_validated

    if not result.errors:
        # Only parse IDs if no validation error. Otherwise
        # user gets confusing messages
        do_id_pass(result)
    return result


def _nested_getitem(data, path):
    for item_index in path:
        try:
            data = data[item_index]
        except (KeyError, IndexError, TypeError):
            return None
    return data


def humanize_error(config, validation_error):
    offending_item_summary = _nested_getitem(config, validation_error.path)
    if isinstance(offending_item_summary, dict):
        offending_item_summary = None
    validation_error = text_type(validation_error)
    m = re.match(r'^(.*?)\s*(?:for dictionary value )?@ data\[.*$', validation_error)
    if m is not None:
        validation_error = m.group(1)
    validation_error = validation_error.strip()
    if not validation_error.endswith(u'.'):
        validation_error += u'.'
    if offending_item_summary is None or is_secret(offending_item_summary):
        return validation_error

    return u"{} Got '{}'".format(validation_error, offending_item_summary)


def _format_vol_invalid(ex, config, path, domain):
    # type: (vol.Invalid, ConfigType, ConfigPath, basestring) -> unicode
    message = u''
    try:
        paren = ex.path[-2]
    except IndexError:
        paren = domain

    if isinstance(ex, ExtraKeysInvalid):
        if ex.candidates:
            message += u'[{}] is an invalid option for [{}]. Did you mean {}?'.format(
                ex.path[-1], paren, u', '.join(u'[{}]'.format(x) for x in ex.candidates))
        else:
            message += u'[{}] is an invalid option for [{}]. Please check the indentation.'.format(
                ex.path[-1], paren)
    elif u'extra keys not allowed' in ex.error_message:
        message += u'[{}] is an invalid option for [{}].'.format(ex.path[-1], paren)
    elif u'required key not provided' in ex.error_message:
        message += u"'{}' is a required option for [{}].".format(ex.path[-1], paren)
    else:
        message += humanize_error(_nested_getitem(config, path), ex)

    return message


def load_config():
    try:
        config = yaml_util.load_yaml(CORE.config_path)
    except OSError:
        raise EsphomeError(u"Invalid YAML at {}. Please see YAML syntax reference or use an online "
                           u"YAML syntax validator".format(CORE.config_path))
    CORE.raw_config = config
    config = substitutions.do_substitution_pass(config)
    core_config.preload_core_config(config)

    try:
        result = validate_config(config)
    except EsphomeError:
        raise
    except Exception:
        _LOGGER.error(u"Unexpected exception while reading configuration:")
        raise

    return result


def line_info(obj, highlight=True):
    """Display line config source."""
    if not highlight:
        return None
    if hasattr(obj, '__config_file__'):
        return color('cyan', "[source {}:{}]"
                     .format(obj.__config_file__, obj.__line__ or '?'))
    return None


def _print_on_next_line(obj):
    if isinstance(obj, (list, tuple, dict)):
        return True
    if isinstance(obj, str):
        return len(obj) > 80
    if isinstance(obj, core.Lambda):
        return len(obj.value) > 80
    return False


def dump_dict(config, path, at_root=True):
    # type: (Config, ConfigPath, bool) -> Tuple[unicode, bool]
    conf = config.nested_item(path)
    ret = u''
    multiline = False

    if at_root:
        error = config.get_error_for_path(path)
        if error is not None:
            ret += u'\n' + color('bold_red', error) + u'\n'

    if isinstance(conf, (list, tuple)):
        multiline = True
        if not conf:
            ret += u'[]'
            multiline = False

        for i in range(len(conf)):
            path_ = path + [i]
            error = config.get_error_for_path(path_)
            if error is not None:
                ret += u'\n' + color('bold_red', error) + u'\n'

            sep = u'- '
            if config.is_in_error_path(path_):
                sep = color('red', sep)
            msg, _ = dump_dict(config, path_, at_root=False)
            msg = indent(msg)
            inf = line_info(config.nested_item(path_), highlight=config.is_in_error_path(path_))
            if inf is not None:
                msg = inf + u'\n' + msg
            elif msg:
                msg = msg[2:]
            ret += sep + msg + u'\n'
    elif isinstance(conf, dict):
        multiline = True
        if not conf:
            ret += u'{}'
            multiline = False

        for k in conf.keys():
            path_ = path + [k]
            error = config.get_error_for_path(path_)
            if error is not None:
                ret += u'\n' + color('bold_red', error) + u'\n'

            st = u'{}: '.format(k)
            if config.is_in_error_path(path_):
                st = color('red', st)
            msg, m = dump_dict(config, path_, at_root=False)

            inf = line_info(config.nested_item(path_), highlight=config.is_in_error_path(path_))
            if m:
                msg = u'\n' + indent(msg)

            if inf is not None:
                if m:
                    msg = u' ' + inf + msg
                else:
                    msg = msg + u' ' + inf
            ret += st + msg + u'\n'
    elif isinstance(conf, str):
        if is_secret(conf):
            conf = u'!secret {}'.format(is_secret(conf))
        if not conf:
            conf += u"''"

        if len(conf) > 80:
            conf = u'|-\n' + indent(conf)
        error = config.get_error_for_path(path)
        col = 'bold_red' if error else 'white'
        ret += color(col, text_type(conf))
    elif isinstance(conf, core.Lambda):
        if is_secret(conf):
            conf = u'!secret {}'.format(is_secret(conf))

        conf = u'!lambda |-\n' + indent(text_type(conf.value))
        error = config.get_error_for_path(path)
        col = 'bold_red' if error else 'white'
        ret += color(col, conf)
    elif conf is None:
        pass
    else:
        error = config.get_error_for_path(path)
        col = 'bold_red' if error else 'white'
        ret += color(col, text_type(conf))
        multiline = u'\n' in ret

    return ret, multiline


def strip_default_ids(config):
    if isinstance(config, list):
        to_remove = []
        for i, x in enumerate(config):
            x = config[i] = strip_default_ids(x)
            if isinstance(x, core.ID) and not x.is_manual:
                to_remove.append(x)
        for x in to_remove:
            config.remove(x)
    elif isinstance(config, dict):
        to_remove = []
        for k, v in config.items():
            v = config[k] = strip_default_ids(v)
            if isinstance(v, core.ID) and not v.is_manual:
                to_remove.append(k)
        for k in to_remove:
            config.pop(k)
    return config


def read_config(verbose):
    _LOGGER.info("Reading configuration...")
    try:
        res = load_config()
    except EsphomeError as err:
        _LOGGER.error(u"Error while reading config: %s", err)
        return None
    if res.errors:
        if not verbose:
            res = strip_default_ids(res)

        safe_print(color('bold_red', u"Failed config"))
        safe_print('')
        for path, domain in res.domains:
            if not res.is_in_error_path(path):
                continue

            safe_print(color('bold_red', u'{}:'.format(domain)) + u' ' +
                       (line_info(res.nested_item(path)) or u''))
            safe_print(indent(dump_dict(res, path)[0]))
        return None
    return OrderedDict(res)
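For reference, a standalone sketch of the path-prefix test that drives the error highlighting above (`errors` and its contents are illustrative; the real code keeps them on the `Config` object):

```python
# Sketch: an error recorded at a nested path also marks every ancestor path.
def path_begins_with(path, other):
    if len(path) < len(other):
        return False
    return path[:len(other)] == other

errors = [(u"expected int", ['sensor', 0, 'update_interval'])]

def is_in_error_path(path):
    return any(path_begins_with(p, path) for _, p in errors)

assert is_in_error_path(['sensor'])             # ancestor of the error path
assert is_in_error_path(['sensor', 0])
assert not is_in_error_path(['binary_sensor'])  # unrelated domain is clean
```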
461
esphome/core.py
@@ -1,461 +0,0 @@
import collections
from collections import OrderedDict
import inspect
import logging
import math
import os
import re

# pylint: disable=unused-import, wrong-import-order
from typing import Any, Dict, List  # noqa

from esphome.const import CONF_ARDUINO_VERSION, CONF_ESPHOME, CONF_ESPHOME_CORE_VERSION, \
    CONF_LOCAL, CONF_USE_ADDRESS, CONF_WIFI, ESP_PLATFORM_ESP32, ESP_PLATFORM_ESP8266, \
    CONF_REPOSITORY, CONF_BRANCH
from esphome.helpers import ensure_unique_string, is_hassio
from esphome.py_compat import IS_PY2, integer_types

_LOGGER = logging.getLogger(__name__)


class EsphomeError(Exception):
    """General ESPHome exception occurred."""


if IS_PY2:
    base_int = long
else:
    base_int = int


class HexInt(base_int):
    def __str__(self):
        if 0 <= self <= 255:
            return "0x{:02X}".format(self)
        return "0x{:X}".format(self)


class IPAddress(object):
    def __init__(self, *args):
        if len(args) != 4:
            raise ValueError(u"IPAddress must consist of 4 items")
        self.args = args

    def __str__(self):
        return '.'.join(str(x) for x in self.args)


class MACAddress(object):
    def __init__(self, *parts):
        if len(parts) != 6:
            raise ValueError(u"MAC Address must consist of 6 items")
        self.parts = parts

    def __str__(self):
        return ':'.join('{:02X}'.format(part) for part in self.parts)

    def as_hex(self):
        from esphome.cpp_generator import RawExpression

        num = ''.join('{:02X}'.format(part) for part in self.parts)
        return RawExpression('0x{}ULL'.format(num))


def is_approximately_integer(value):
    if isinstance(value, integer_types):
        return True
    return abs(value - round(value)) < 0.001


class TimePeriod(object):
    def __init__(self, microseconds=None, milliseconds=None, seconds=None,
                 minutes=None, hours=None, days=None):
        if days is not None:
            if not is_approximately_integer(days):
                frac_days, days = math.modf(days)
                hours = (hours or 0) + frac_days * 24
            self.days = int(round(days))
        else:
            self.days = None

        if hours is not None:
            if not is_approximately_integer(hours):
                frac_hours, hours = math.modf(hours)
                minutes = (minutes or 0) + frac_hours * 60
            self.hours = int(round(hours))
        else:
            self.hours = None

        if minutes is not None:
            if not is_approximately_integer(minutes):
                frac_minutes, minutes = math.modf(minutes)
                seconds = (seconds or 0) + frac_minutes * 60
            self.minutes = int(round(minutes))
        else:
            self.minutes = None

        if seconds is not None:
            if not is_approximately_integer(seconds):
                frac_seconds, seconds = math.modf(seconds)
                milliseconds = (milliseconds or 0) + frac_seconds * 1000
            self.seconds = int(round(seconds))
        else:
            self.seconds = None

        if milliseconds is not None:
            if not is_approximately_integer(milliseconds):
                frac_milliseconds, milliseconds = math.modf(milliseconds)
                microseconds = (microseconds or 0) + frac_milliseconds * 1000
            self.milliseconds = int(round(milliseconds))
        else:
            self.milliseconds = None

        if microseconds is not None:
            if not is_approximately_integer(microseconds):
                raise ValueError("Maximum precision is microseconds")
            self.microseconds = int(round(microseconds))
        else:
            self.microseconds = None

    def as_dict(self):
        out = OrderedDict()
        if self.microseconds is not None:
            out['microseconds'] = self.microseconds
        if self.milliseconds is not None:
            out['milliseconds'] = self.milliseconds
        if self.seconds is not None:
            out['seconds'] = self.seconds
        if self.minutes is not None:
            out['minutes'] = self.minutes
        if self.hours is not None:
            out['hours'] = self.hours
        if self.days is not None:
            out['days'] = self.days
        return out

    def __str__(self):
        if self.microseconds is not None:
            return '{} us'.format(self.total_microseconds)
        if self.milliseconds is not None:
            return '{} ms'.format(self.total_milliseconds)
        if self.seconds is not None:
            return '{} s'.format(self.total_seconds)
        if self.minutes is not None:
            return '{} min'.format(self.total_minutes)
        if self.hours is not None:
            return '{} h'.format(self.total_hours)
        if self.days is not None:
            return '{} d'.format(self.total_days)
        return '0'

    @property
    def total_microseconds(self):
        return self.total_milliseconds * 1000 + (self.microseconds or 0)

    @property
    def total_milliseconds(self):
        return self.total_seconds * 1000 + (self.milliseconds or 0)

    @property
    def total_seconds(self):
        return self.total_minutes * 60 + (self.seconds or 0)

    @property
    def total_minutes(self):
        return self.total_hours * 60 + (self.minutes or 0)

    @property
    def total_hours(self):
        return self.total_days * 24 + (self.hours or 0)

    @property
    def total_days(self):
        return self.days or 0

    def __eq__(self, other):
        if not isinstance(other, TimePeriod):
            raise ValueError("other must be TimePeriod")
        return self.total_microseconds == other.total_microseconds

    def __ne__(self, other):
        if not isinstance(other, TimePeriod):
            raise ValueError("other must be TimePeriod")
        return self.total_microseconds != other.total_microseconds

    def __lt__(self, other):
        if not isinstance(other, TimePeriod):
            raise ValueError("other must be TimePeriod")
        return self.total_microseconds < other.total_microseconds

    def __gt__(self, other):
        if not isinstance(other, TimePeriod):
            raise ValueError("other must be TimePeriod")
        return self.total_microseconds > other.total_microseconds

    def __le__(self, other):
        if not isinstance(other, TimePeriod):
            raise ValueError("other must be TimePeriod")
        return self.total_microseconds <= other.total_microseconds

    def __ge__(self, other):
        if not isinstance(other, TimePeriod):
            raise ValueError("other must be TimePeriod")
        return self.total_microseconds >= other.total_microseconds


class TimePeriodMicroseconds(TimePeriod):
    pass


class TimePeriodMilliseconds(TimePeriod):
    pass


class TimePeriodSeconds(TimePeriod):
    pass


class TimePeriodMinutes(TimePeriod):
    pass
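For reference, a worked example of the fractional carry in `TimePeriod.__init__` above: a non-integer value at one unit has its fraction carried down into the next smaller unit before rounding.

```python
import math

# Sketch: 1.5 minutes is not approximately integer, so the 0.5-minute
# fraction is converted to seconds before both fields are rounded.
minutes = 1.5
frac_minutes, minutes = math.modf(minutes)       # -> (0.5, 1.0)
seconds = frac_minutes * 60                      # -> 30.0
print(int(round(minutes)), int(round(seconds)))  # 1 30, i.e. 1 min 30 s
```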
LAMBDA_PROG = re.compile(r'id\(\s*([a-zA-Z_][a-zA-Z0-9_]*)\s*\)(\.?)')


class Lambda(object):
    def __init__(self, value):
        self._value = value
        self._parts = None
        self._requires_ids = None

    @property
    def parts(self):
        if self._parts is None:
            self._parts = re.split(LAMBDA_PROG, self._value)
        return self._parts

    @property
    def requires_ids(self):
        if self._requires_ids is None:
            self._requires_ids = [ID(self.parts[i]) for i in range(1, len(self.parts), 3)]
        return self._requires_ids

    @property
    def value(self):
        return self._value

    @value.setter
    def value(self, value):
        self._value = value
        self._parts = None
        self._requires_ids = None

    def __str__(self):
        return self.value

    def __repr__(self):
        return u'Lambda<{}>'.format(self.value)


class ID(object):
    def __init__(self, id, is_declaration=False, type=None):
        self.id = id
        self.is_manual = id is not None
        self.is_declaration = is_declaration
        self.type = type

    def resolve(self, registered_ids):
        from esphome.config_validation import RESERVED_IDS

        if self.id is None:
            base = str(self.type).replace('::', '_').lower()
            name = ''.join(c for c in base if c.isalnum() or c == '_')
            used = set(registered_ids) | set(RESERVED_IDS)
            self.id = ensure_unique_string(name, used)
        return self.id

    def __str__(self):
        if self.id is None:
            return ''
        return self.id

    def __repr__(self):
        return u'ID<{} declaration={}, type={}, manual={}>'.format(
            self.id, self.is_declaration, self.type, self.is_manual)

    def __eq__(self, other):
        if not isinstance(other, ID):
            raise ValueError("other must be ID")
        return self.id == other.id

    def __hash__(self):
        return hash(self.id)


# pylint: disable=too-many-instance-attributes,too-many-public-methods
class EsphomeCore(object):
    def __init__(self):
        # True if command is run from dashboard
        self.dashboard = False
        # The name of the node
        self.name = None  # type: str
        # The relative path to the configuration YAML
        self.config_path = None  # type: str
        # The relative path to where all build files are stored
        self.build_path = None  # type: str
        # The platform (ESP8266, ESP32) of this device
        self.esp_platform = None  # type: str
        # The board that's used (for example nodemcuv2)
        self.board = None  # type: str
        # The full raw configuration
        self.raw_config = {}  # type: ConfigType
        # The validated configuration, this is None until the config has been validated
        self.config = {}  # type: ConfigType
        # The pending tasks in the task queue (mostly for C++ generation)
        self.pending_tasks = collections.deque()
        # The variable cache, for each ID this holds a MockObj of the variable obj
        self.variables = {}  # type: Dict[str, MockObj]
        # The list of expressions for the C++ generation
        self.expressions = []  # type: List[Expression]

    @property
    def address(self):  # type: () -> str
        if 'wifi' in self.config:
            return self.config[CONF_WIFI][CONF_USE_ADDRESS]

        if 'ethernet' in self.config:
            return self.config['ethernet'][CONF_USE_ADDRESS]

        return None

    @property
    def esphome_core_version(self):  # type: () -> Dict[str, str]
        return self.config[CONF_ESPHOME][CONF_ESPHOME_CORE_VERSION]

    @property
    def is_dev_esphome_core_version(self):
        if CONF_REPOSITORY not in self.esphome_core_version:
            return False
        return self.esphome_core_version.get(CONF_BRANCH) == 'dev'

    @property
    def is_local_esphome_core_copy(self):
        return CONF_LOCAL in self.esphome_core_version

    @property
    def arduino_version(self):  # type: () -> str
        return self.config[CONF_ESPHOME][CONF_ARDUINO_VERSION]

    @property
    def config_dir(self):
        return os.path.dirname(self.config_path)

    @property
    def config_filename(self):
        return os.path.basename(self.config_path)

    def relative_path(self, *path):
        path_ = os.path.expanduser(os.path.join(*path))
        return os.path.join(self.config_dir, path_)

    def relative_build_path(self, *path):
        path_ = os.path.expanduser(os.path.join(*path))
        return os.path.join(self.build_path, path_)

    def relative_pioenvs_path(self, *path):
        if is_hassio():
            return os.path.join('/data', self.name, '.pioenvs', *path)
        return self.relative_build_path('.pioenvs', *path)

    def relative_piolibdeps_path(self, *path):
        if is_hassio():
            return os.path.join('/data', self.name, '.piolibdeps', *path)
        return self.relative_build_path('.piolibdeps', *path)

    @property
    def firmware_bin(self):
        return self.relative_pioenvs_path(self.name, 'firmware.bin')

    @property
    def is_esp8266(self):
        if self.esp_platform is None:
            raise ValueError
        return self.esp_platform == ESP_PLATFORM_ESP8266

    @property
    def is_esp32(self):
        if self.esp_platform is None:
            raise ValueError
        return self.esp_platform == ESP_PLATFORM_ESP32

    def add_job(self, func, *args, **kwargs):
        domain = kwargs.get('domain')
        if inspect.isgeneratorfunction(func):
            def func_():
                yield
                for _ in func(*args):
                    yield
        else:
            def func_():
                yield
                func(*args)
        gen = func_()
        self.pending_tasks.append((gen, domain))
        return gen

    def flush_tasks(self):
        i = 0
        while self.pending_tasks:
            i += 1
            if i > 1000000:
                raise EsphomeError("Circular dependency detected!")

            task, domain = self.pending_tasks.popleft()
            _LOGGER.debug("Executing task for domain=%s", domain)
            try:
                next(task)
                self.pending_tasks.append((task, domain))
            except StopIteration:
                _LOGGER.debug(" -> %s finished", domain)

    def add(self, expression, require=True):
        from esphome.cpp_generator import Expression

        if require and isinstance(expression, Expression):
            expression.require()
        self.expressions.append(expression)
        _LOGGER.debug("Adding: %s", expression)
        return expression

    def get_variable(self, id):
        while True:
            if id in self.variables:
                yield self.variables[id]
                return
            _LOGGER.debug("Waiting for variable %s", id)
            yield None

    def get_variable_with_full_id(self, id):
        while True:
            if id in self.variables:
                for k, v in self.variables.items():
                    if k == id:
                        yield (k, v)
                        return
            _LOGGER.debug("Waiting for variable %s", id)
            yield None, None

    def register_variable(self, id, obj):
        if id in self.variables:
            raise EsphomeError("ID {} is already registered".format(id))
        _LOGGER.debug("Registered variable %s of type %s", id.id, id.type)
        self.variables[id] = obj

    def has_id(self, id):
        return id in self.variables


CORE = EsphomeCore()

ConfigType = Dict[str, Any]
CoreType = EsphomeCore
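For reference, a standalone sketch of the cooperative scheduler formed by `add_job`/`flush_tasks`/`get_variable` above: a generator task that is not ready simply yields, and the flush loop re-queues it until it raises `StopIteration` (the `wait_for` task and `my_sensor` variable are illustrative):

```python
import collections

pending = collections.deque()
variables = {}

def wait_for(name):
    while name not in variables:
        yield                    # not ready yet; scheduler will retry
    print('got', name, '=', variables[name])

pending.append(wait_for('my_sensor'))

steps = 0
while pending:
    task = pending.popleft()
    try:
        next(task)
        pending.append(task)     # task yielded: still waiting, re-queue it
    except StopIteration:
        continue                 # task finished
    steps += 1
    if steps == 3:
        variables['my_sensor'] = 42   # becomes available mid-flush
# prints: got my_sensor = 42
```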
@@ -1,64 +0,0 @@
from esphome.const import CONF_INVERTED, CONF_MODE, CONF_NUMBER, CONF_PCF8574, \
    CONF_SETUP_PRIORITY, CONF_MCP23017
from esphome.core import CORE, EsphomeError
from esphome.cpp_generator import IntLiteral, RawExpression
from esphome.cpp_types import GPIOInputPin, GPIOOutputPin


def generic_gpio_pin_expression_(conf, mock_obj, default_mode):
    if conf is None:
        return
    number = conf[CONF_NUMBER]
    inverted = conf.get(CONF_INVERTED)
    if CONF_PCF8574 in conf:
        from esphome.components import pcf8574

        for hub in CORE.get_variable(conf[CONF_PCF8574]):
            yield None

        if default_mode == u'INPUT':
            mode = pcf8574.PCF8675_GPIO_MODES[conf.get(CONF_MODE, u'INPUT')]
            yield hub.make_input_pin(number, mode, inverted)
            return
        if default_mode == u'OUTPUT':
            yield hub.make_output_pin(number, inverted)
            return

        raise EsphomeError(u"Unknown default mode {}".format(default_mode))
    if CONF_MCP23017 in conf:
        from esphome.components import mcp23017

        for hub in CORE.get_variable(conf[CONF_MCP23017]):
            yield None

        if default_mode == u'INPUT':
            mode = mcp23017.MCP23017_GPIO_MODES[conf.get(CONF_MODE, u'INPUT')]
            yield hub.make_input_pin(number, mode, inverted)
            return
        if default_mode == u'OUTPUT':
            yield hub.make_output_pin(number, inverted)
            return

        raise EsphomeError(u"Unknown default mode {}".format(default_mode))
    if len(conf) == 1:
        yield IntLiteral(number)
        return
    mode = RawExpression(conf.get(CONF_MODE, default_mode))
    yield mock_obj(number, mode, inverted)


def gpio_output_pin_expression(conf):
    for exp in generic_gpio_pin_expression_(conf, GPIOOutputPin, 'OUTPUT'):
        yield None
    yield exp


def gpio_input_pin_expression(conf):
    for exp in generic_gpio_pin_expression_(conf, GPIOInputPin, 'INPUT'):
        yield None
    yield exp


def setup_component(obj, config):
    if CONF_SETUP_PRIORITY in config:
        CORE.add(obj.set_setup_priority(config[CONF_SETUP_PRIORITY]))
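For reference, a standalone sketch of the "yield `None` until ready" convention used by `gpio_*_pin_expression` above: intermediate `None` values mean "still waiting", and the final yielded value is the real expression, left bound to the loop variable after iteration ends (the pin string is illustrative):

```python
def pin_expression():
    yield None                 # stand-in for waiting on an I/O-expander hub
    yield 'GPIOOutputPin(5)'   # the last value is the actual expression

exp = None
for exp in pin_expression():
    pass                       # the loop leaves exp bound to the last value
print(exp)  # GPIOOutputPin(5)
```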
@@ -1,36 +0,0 @@
from esphome.cpp_generator import MockObj

global_ns = MockObj('', '')
void = global_ns.namespace('void')
float_ = global_ns.namespace('float')
bool_ = global_ns.namespace('bool')
std_ns = global_ns.namespace('std')
std_string = std_ns.class_('string')
std_vector = std_ns.class_('vector')
uint8 = global_ns.namespace('uint8_t')
uint16 = global_ns.namespace('uint16_t')
uint32 = global_ns.namespace('uint32_t')
int32 = global_ns.namespace('int32_t')
const_char_ptr = global_ns.namespace('const char *')
NAN = global_ns.namespace('NAN')
esphome_ns = global_ns  # using namespace esphome;
App = esphome_ns.App
io_ns = esphome_ns.namespace('io')
Nameable = esphome_ns.class_('Nameable')
Trigger = esphome_ns.class_('Trigger')
Action = esphome_ns.class_('Action')
Component = esphome_ns.class_('Component')
ComponentPtr = Component.operator('ptr')
PollingComponent = esphome_ns.class_('PollingComponent', Component)
Application = esphome_ns.class_('Application')
optional = esphome_ns.class_('optional')
arduino_json_ns = global_ns.namespace('ArduinoJson')
JsonObject = arduino_json_ns.class_('JsonObject')
JsonObjectRef = JsonObject.operator('ref')
JsonObjectConstRef = JsonObjectRef.operator('const')
Controller = esphome_ns.class_('Controller')
StoringController = esphome_ns.class_('StoringController', Controller)

GPIOPin = esphome_ns.class_('GPIOPin')
GPIOOutputPin = esphome_ns.class_('GPIOOutputPin', GPIOPin)
GPIOInputPin = esphome_ns.class_('GPIOInputPin', GPIOPin)
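For reference, a toy re-implementation (not esphome's actual `MockObj`) of the idea behind the declarations above: nested namespace and class handles compose into qualified C++ names for code generation.

```python
class Ns(object):
    def __init__(self, name):
        self.name = name

    def class_(self, cls, *parents):
        # parent classes only matter for real code generation; ignored here
        return Ns('{}::{}'.format(self.name, cls) if self.name else cls)

    namespace = class_

    def __str__(self):
        return self.name

global_ns = Ns('')
std_ns = global_ns.namespace('std')
print(std_ns.class_('string'))  # std::string
```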
@@ -1,638 +0,0 @@
# pylint: disable=wrong-import-position
from __future__ import print_function

import codecs
import collections
import hmac
import json
import logging
import multiprocessing
import os
import subprocess
import threading

import tornado
import tornado.concurrent
import tornado.gen
import tornado.httpserver
import tornado.ioloop
import tornado.iostream
from tornado.log import access_log
import tornado.netutil
import tornado.process
import tornado.web
import tornado.websocket

from esphome import const
from esphome.__main__ import get_serial_ports
from esphome.helpers import mkdir_p, get_bool_env, run_system_command
from esphome.py_compat import IS_PY2
from esphome.storage_json import EsphomeStorageJSON, StorageJSON, \
    esphome_storage_path, ext_storage_path
from esphome.util import shlex_quote

# pylint: disable=unused-import, wrong-import-order
from typing import Optional  # noqa

from esphome.zeroconf import DashboardStatus, Zeroconf

_LOGGER = logging.getLogger(__name__)
CONFIG_DIR = ''
PASSWORD_DIGEST = ''
COOKIE_SECRET = None
USING_PASSWORD = False
ON_HASSIO = False
USING_HASSIO_AUTH = True
HASSIO_MQTT_CONFIG = None
RELATIVE_URL = os.getenv('ESPHOME_DASHBOARD_RELATIVE_URL', '/')
STATUS_USE_PING = get_bool_env('ESPHOME_DASHBOARD_USE_PING')

if IS_PY2:
    cookie_authenticated_yes = 'yes'
else:
    cookie_authenticated_yes = b'yes'


def template_args():
    version = const.__version__
    return {
        'version': version,
        'docs_link': 'https://beta.esphome.io/' if 'b' in version else 'https://esphome.io/',
        'get_static_file_url': get_static_file_url,
        'relative_url': RELATIVE_URL,
        'streamer_mode': get_bool_env('ESPHOME_STREAMER_MODE'),
    }


# pylint: disable=abstract-method
class BaseHandler(tornado.web.RequestHandler):
    def is_authenticated(self):
        if USING_HASSIO_AUTH or USING_PASSWORD:
            return self.get_secure_cookie('authenticated') == cookie_authenticated_yes

        return True


# pylint: disable=abstract-method, arguments-differ
class EsphomeCommandWebSocket(tornado.websocket.WebSocketHandler):
    def __init__(self, application, request, **kwargs):
        super(EsphomeCommandWebSocket, self).__init__(application, request, **kwargs)
        self.proc = None
        self.closed = False

    def on_message(self, message):
        if USING_HASSIO_AUTH or USING_PASSWORD:
            if self.get_secure_cookie('authenticated') != cookie_authenticated_yes:
                return
        if self.proc is not None:
            return
        command = self.build_command(message)
        _LOGGER.info(u"Running command '%s'", ' '.join(shlex_quote(x) for x in command))
        self.proc = tornado.process.Subprocess(command,
                                               stdout=tornado.process.Subprocess.STREAM,
                                               stderr=subprocess.STDOUT)
        self.proc.set_exit_callback(self.proc_on_exit)
        tornado.ioloop.IOLoop.current().spawn_callback(self.redirect_stream)

    @tornado.gen.coroutine
    def redirect_stream(self):
        while True:
            try:
                if IS_PY2:
                    reg = '[\n\r]'
                else:
                    reg = b'[\n\r]'
                data = yield self.proc.stdout.read_until_regex(reg)
                if not IS_PY2:
                    data = data.decode('utf-8', 'backslashreplace')
            except tornado.iostream.StreamClosedError:
                break
            try:
                self.write_message({'event': 'line', 'data': data})
            except UnicodeDecodeError:
                data = codecs.decode(data, 'utf8', 'replace')
                self.write_message({'event': 'line', 'data': data})

    def proc_on_exit(self, returncode):
        if not self.closed:
            _LOGGER.debug("Process exited with return code %s", returncode)
            self.write_message({'event': 'exit', 'code': returncode})

    def on_close(self):
        self.closed = True
        if self.proc is not None and self.proc.returncode is None:
            _LOGGER.debug("Terminating process")
            self.proc.proc.terminate()

    def build_command(self, message):
        raise NotImplementedError


class EsphomeLogsHandler(EsphomeCommandWebSocket):
    def build_command(self, message):
        js = json.loads(message)
        config_file = CONFIG_DIR + '/' + js['configuration']
        return ["esphome", "--dashboard", config_file, "logs", '--serial-port', js["port"]]


class EsphomeRunHandler(EsphomeCommandWebSocket):
    def build_command(self, message):
        js = json.loads(message)
        config_file = os.path.join(CONFIG_DIR, js['configuration'])
        return ["esphome", "--dashboard", config_file, "run", '--upload-port', js["port"]]


class EsphomeCompileHandler(EsphomeCommandWebSocket):
    def build_command(self, message):
        js = json.loads(message)
        config_file = os.path.join(CONFIG_DIR, js['configuration'])
        return ["esphome", "--dashboard", config_file, "compile"]


class EsphomeValidateHandler(EsphomeCommandWebSocket):
    def build_command(self, message):
        js = json.loads(message)
        config_file = os.path.join(CONFIG_DIR, js['configuration'])
        return ["esphome", "--dashboard", config_file, "config"]


class EsphomeCleanMqttHandler(EsphomeCommandWebSocket):
    def build_command(self, message):
        js = json.loads(message)
        config_file = os.path.join(CONFIG_DIR, js['configuration'])
        return ["esphome", "--dashboard", config_file, "clean-mqtt"]


class EsphomeCleanHandler(EsphomeCommandWebSocket):
    def build_command(self, message):
        js = json.loads(message)
        config_file = os.path.join(CONFIG_DIR, js['configuration'])
        return ["esphome", "--dashboard", config_file, "clean"]


class EsphomeHassConfigHandler(EsphomeCommandWebSocket):
    def build_command(self, message):
        js = json.loads(message)
        config_file = os.path.join(CONFIG_DIR, js['configuration'])
        return ["esphome", "--dashboard", config_file, "hass-config"]


class SerialPortRequestHandler(BaseHandler):
    def get(self):
        if not self.is_authenticated():
            self.redirect(RELATIVE_URL + 'login')
            return
        ports = get_serial_ports()
        data = []
        for port, desc in ports:
            if port == '/dev/ttyAMA0':
                desc = 'UART pins on GPIO header'
            split_desc = desc.split(' - ')
            if len(split_desc) == 2 and split_desc[0] == split_desc[1]:
                # Some serial ports repeat their values
                desc = split_desc[0]
            data.append({'port': port, 'desc': desc})
        data.append({'port': 'OTA', 'desc': 'Over-The-Air'})
        data.sort(key=lambda x: x['port'], reverse=True)
        self.write(json.dumps(data))


class WizardRequestHandler(BaseHandler):
    def post(self):
        from esphome import wizard

        if not self.is_authenticated():
            self.redirect(RELATIVE_URL + 'login')
            return
        kwargs = {k: ''.join(v) for k, v in self.request.arguments.items()}
        destination = os.path.join(CONFIG_DIR, kwargs['name'] + '.yaml')
        wizard.wizard_write(path=destination, **kwargs)
        self.redirect('/?begin=True')


class DownloadBinaryRequestHandler(BaseHandler):
    def get(self):
        if not self.is_authenticated():
            self.redirect(RELATIVE_URL + 'login')
            return

        # pylint: disable=no-value-for-parameter
        configuration = self.get_argument('configuration')
        storage_path = ext_storage_path(CONFIG_DIR, configuration)
        storage_json = StorageJSON.load(storage_path)
        if storage_json is None:
            self.send_error()
            return

        path = storage_json.firmware_bin_path
        self.set_header('Content-Type', 'application/octet-stream')
        filename = '{}.bin'.format(storage_json.name)
        self.set_header("Content-Disposition", 'attachment; filename="{}"'.format(filename))
        with open(path, 'rb') as f:
            while True:
                data = f.read(16384)
                if not data:
                    break
                self.write(data)
        self.finish()


def _list_yaml_files():
    files = []
    for file in os.listdir(CONFIG_DIR):
        if not file.endswith('.yaml'):
            continue
        if file.startswith('.'):
            continue
        if file == 'secrets.yaml':
            continue
        files.append(file)
    files.sort()
    return files


def _list_dashboard_entries():
    files = _list_yaml_files()
    return [DashboardEntry(file) for file in files]


class DashboardEntry(object):
    def __init__(self, filename):
        self.filename = filename
        self._storage = None
        self._loaded_storage = False

    @property
    def full_path(self):  # type: () -> str
        return os.path.join(CONFIG_DIR, self.filename)

    @property
    def storage(self):  # type: () -> Optional[StorageJSON]
        if not self._loaded_storage:
            self._storage = StorageJSON.load(ext_storage_path(CONFIG_DIR, self.filename))
            self._loaded_storage = True
        return self._storage

    @property
    def address(self):
        if self.storage is None:
            return None
        return self.storage.address

    @property
    def name(self):
        if self.storage is None:
            return self.filename[:-len('.yaml')]
        return self.storage.name

    @property
    def esp_platform(self):
        if self.storage is None:
            return None
        return self.storage.esp_platform

    @property
    def board(self):
        if self.storage is None:
            return None
        return self.storage.board

    @property
    def update_available(self):
        if self.storage is None:
            return True
        return self.update_old != self.update_new

    @property
    def update_old(self):
        if self.storage is None:
            return ''
        return self.storage.esphome_version or ''

    @property
    def update_new(self):
        return const.__version__
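For reference, a standalone sketch of the lazy-load pattern used by `DashboardEntry.storage` above: the storage JSON is read at most once per entry, even when the file turns out to be missing (the `Entry` class and filename are illustrative):

```python
class Entry(object):
    def __init__(self, filename):
        self.filename = filename
        self._storage = None
        self._loaded_storage = False

    @property
    def storage(self):
        if not self._loaded_storage:
            print('loading storage for', self.filename)  # happens once
            self._storage = None  # stand-in for StorageJSON.load(...)
            self._loaded_storage = True
        return self._storage

e = Entry('livingroom.yaml')
e.storage
e.storage   # second access: cached, no load
```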
class MainRequestHandler(BaseHandler):
|
||||
def get(self):
|
||||
if not self.is_authenticated():
|
||||
self.redirect(RELATIVE_URL + 'login')
|
||||
return
|
||||
|
||||
begin = bool(self.get_argument('begin', False))
|
||||
entries = _list_dashboard_entries()
|
||||
|
||||
self.render("templates/index.html", entries=entries, begin=begin,
|
||||
**template_args())
|
||||
|
||||
|
||||
def _ping_func(filename, address):
|
||||
if os.name == 'nt':
|
||||
command = ['ping', '-n', '1', address]
|
||||
else:
|
||||
command = ['ping', '-c', '1', address]
|
||||
rc, _, _ = run_system_command(*command)
|
||||
return filename, rc == 0
|
||||
|
||||
|
||||
class MDNSStatusThread(threading.Thread):
|
||||
def run(self):
|
||||
zc = Zeroconf()
|
||||
|
||||
def on_update(dat):
|
||||
for key, b in dat.items():
|
||||
PING_RESULT[key] = b
|
||||
|
||||
stat = DashboardStatus(zc, on_update)
|
||||
stat.start()
|
||||
while not STOP_EVENT.is_set():
|
||||
entries = _list_dashboard_entries()
|
||||
stat.request_query({entry.filename: entry.name + '.local.' for entry in entries})
|
||||
|
||||
PING_REQUEST.wait()
|
||||
PING_REQUEST.clear()
|
||||
stat.stop()
|
||||
stat.join()
|
||||
zc.close()
|
||||
|
||||
|
||||
class PingStatusThread(threading.Thread):
    def run(self):
        pool = multiprocessing.Pool(processes=8)
        while not STOP_EVENT.is_set():
            # Only do pings if somebody has the dashboard open

            def callback(ret):
                PING_RESULT[ret[0]] = ret[1]

            entries = _list_dashboard_entries()
            queue = collections.deque()
            for entry in entries:
                if entry.address is None:
                    PING_RESULT[entry.filename] = None
                    continue

                result = pool.apply_async(_ping_func, (entry.filename, entry.address),
                                          callback=callback)
                queue.append(result)

            while queue:
                item = queue[0]
                if item.ready():
                    queue.popleft()
                    continue

                try:
                    item.get(0.1)
                except OSError:
                    # ping not installed
                    pass
                except multiprocessing.TimeoutError:
                    pass

                if STOP_EVENT.is_set():
                    pool.terminate()
                    return

            PING_REQUEST.wait()
            PING_REQUEST.clear()
class PingRequestHandler(BaseHandler):
    def get(self):
        if not self.is_authenticated():
            self.redirect(RELATIVE_URL + 'login')
            return

        PING_REQUEST.set()
        self.write(json.dumps(PING_RESULT))
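The endpoint returns a flat JSON object mapping configuration filenames to true (reachable), false (unreachable) or null (no address known yet); setting PING_REQUEST wakes the status thread for its next round. A hypothetical external check (host and port are illustrative, and this assumes no password auth is enabled):

```python
import requests

# Illustrative URL; substitute your dashboard host, port and RELATIVE_URL.
status = requests.get('http://localhost:6052/ping').json()
for filename, online in status.items():
    label = {True: 'online', False: 'offline', None: 'unknown'}[online]
    print(filename, label)
```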
def is_allowed(configuration):
    return os.path.sep not in configuration
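This keeps edit requests from escaping CONFIG_DIR by rejecting any name containing the platform's path separator. Since os.path.sep is '\\' on Windows, a forward slash would slip through there; a stricter variant, purely as an illustrative hardening sketch and not the dashboard's actual check:

```python
def is_allowed_strict(configuration):
    # Illustrative hardening sketch: reject both separator styles and
    # bare parent-directory components, regardless of platform.
    return ('/' not in configuration and '\\' not in configuration
            and configuration not in ('.', '..'))
```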
class EditRequestHandler(BaseHandler):
    def get(self):
        if not self.is_authenticated():
            self.redirect(RELATIVE_URL + 'login')
            return
        # pylint: disable=no-value-for-parameter
        configuration = self.get_argument('configuration')
        if not is_allowed(configuration):
            self.set_status(401)
            return

        with open(os.path.join(CONFIG_DIR, configuration), 'r') as f:
            content = f.read()
        self.write(content)

    def post(self):
        if not self.is_authenticated():
            self.redirect(RELATIVE_URL + 'login')
            return
        # pylint: disable=no-value-for-parameter
        configuration = self.get_argument('configuration')
        if not is_allowed(configuration):
            self.set_status(401)
            return

        with open(os.path.join(CONFIG_DIR, configuration), 'wb') as f:
            f.write(self.request.body)
        self.set_status(200)
        return
PING_RESULT = {}  # type: dict
STOP_EVENT = threading.Event()
PING_REQUEST = threading.Event()
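These module-level objects coordinate the status threads with the request handlers: a handler sets PING_REQUEST to wake the worker, the worker publishes into PING_RESULT, and STOP_EVENT ends the loop. A stripped-down sketch of the same wait/clear pattern, independent of the dashboard:

```python
import threading

stop = threading.Event()
wake = threading.Event()
results = {}

def worker():
    while not stop.is_set():
        results['status'] = 'refreshed'  # one round of work
        wake.wait()    # sleep until a request arrives...
        wake.clear()   # ...then re-arm the event for the next round

t = threading.Thread(target=worker)
t.start()
wake.set()   # what PingRequestHandler.get() does
stop.set()   # shutdown, mirroring the KeyboardInterrupt path below
wake.set()
t.join()
```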
class LoginHandler(BaseHandler):
    def get(self):
        if USING_HASSIO_AUTH:
            self.render_hassio_login()
            return
        self.write('<html><body><form action="' + RELATIVE_URL + 'login" method="post">'
                   'Password: <input type="password" name="password">'
                   '<input type="submit" value="Sign in">'
                   '</form></body></html>')

    def render_hassio_login(self, error=None):
        self.render("templates/login.html", error=error, **template_args())

    def post_hassio_login(self):
        import requests

        headers = {
            'X-HASSIO-KEY': os.getenv('HASSIO_TOKEN'),
        }
        data = {
            'username': str(self.get_argument('username', '')),
            'password': str(self.get_argument('password', ''))
        }
        try:
            req = requests.post('http://hassio/auth', headers=headers, data=data)
            if req.status_code == 200:
                self.set_secure_cookie("authenticated", cookie_authenticated_yes)
                self.redirect('/')
                return
        except Exception as err:  # pylint: disable=broad-except
            _LOGGER.warning("Error during Hass.io auth request: %s", err)
            self.set_status(500)
            self.render_hassio_login(error="Internal server error")
            return
        self.set_status(401)
        self.render_hassio_login(error="Invalid username or password")

    def post(self):
        if USING_HASSIO_AUTH:
            self.post_hassio_login()
            return

        password = str(self.get_argument("password", ''))
        if IS_PY2:
            password = hmac.new(password).digest()
        else:
            password = hmac.new(password.encode()).digest()
        if hmac.compare_digest(PASSWORD_DIGEST, password):
            self.set_secure_cookie("authenticated", cookie_authenticated_yes)
            self.redirect("/")
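Both sides of the login check go through the same hmac.new(...).digest() transform, and hmac.compare_digest does the comparison in constant time, so an attacker cannot learn a digest prefix from response timing. The pattern in isolation (digestmod is spelled out here because current Python versions require it):

```python
import hmac

# Stored digest, computed once at startup, versus the login attempt.
stored = hmac.new(b'correct horse', digestmod='md5').digest()
attempt = hmac.new(b'correct horse', digestmod='md5').digest()

# compare_digest takes time independent of where the inputs differ,
# unlike `stored == attempt`, which can leak a timing signal.
assert hmac.compare_digest(stored, attempt)
```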
_STATIC_FILE_HASHES = {}


def get_static_file_url(name):
    static_path = os.path.join(os.path.dirname(__file__), 'static')
    if name in _STATIC_FILE_HASHES:
        hash_ = _STATIC_FILE_HASHES[name]
    else:
        path = os.path.join(static_path, name)
        with open(path, 'rb') as f_handle:
            hash_ = hash(f_handle.read()) & (2**32-1)
        _STATIC_FILE_HASHES[name] = hash_
    return RELATIVE_URL + u'static/{}?hash={:08X}'.format(name, hash_)
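The hash acts as a cache-buster: templates embed the returned URL, and because the query string changes whenever the file's contents change, browsers refetch updated assets after an upgrade. Illustrative call (the filename and resulting hash value are made up):

```python
# Inside a template helper; filename and hash are illustrative only.
url = get_static_file_url('esphomeyaml.js')
# -> '/static/esphomeyaml.js?hash=3A5F09C1' when RELATIVE_URL is '/'
```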
def make_app(debug=False):
    def log_function(handler):
        if handler.get_status() < 400:
            log_method = access_log.info

            if isinstance(handler, SerialPortRequestHandler) and not debug:
                return
            if isinstance(handler, PingRequestHandler) and not debug:
                return
        elif handler.get_status() < 500:
            log_method = access_log.warning
        else:
            log_method = access_log.error

        request_time = 1000.0 * handler.request.request_time()
        # pylint: disable=protected-access
        log_method("%d %s %.2fms", handler.get_status(),
                   handler._request_summary(), request_time)

    class StaticFileHandler(tornado.web.StaticFileHandler):
        def set_extra_headers(self, path):
            if debug:
                self.set_header('Cache-Control', 'no-store, no-cache, must-revalidate, max-age=0')

    static_path = os.path.join(os.path.dirname(__file__), 'static')
    settings = {
        'debug': debug,
        'cookie_secret': COOKIE_SECRET,
        'log_function': log_function,
        'websocket_ping_interval': 30.0,
    }
    app = tornado.web.Application([
        (RELATIVE_URL + "", MainRequestHandler),
        (RELATIVE_URL + "login", LoginHandler),
        (RELATIVE_URL + "logs", EsphomeLogsHandler),
        (RELATIVE_URL + "run", EsphomeRunHandler),
        (RELATIVE_URL + "compile", EsphomeCompileHandler),
        (RELATIVE_URL + "validate", EsphomeValidateHandler),
        (RELATIVE_URL + "clean-mqtt", EsphomeCleanMqttHandler),
        (RELATIVE_URL + "clean", EsphomeCleanHandler),
        (RELATIVE_URL + "hass-config", EsphomeHassConfigHandler),
        (RELATIVE_URL + "edit", EditRequestHandler),
        (RELATIVE_URL + "download.bin", DownloadBinaryRequestHandler),
        (RELATIVE_URL + "serial-ports", SerialPortRequestHandler),
        (RELATIVE_URL + "ping", PingRequestHandler),
        (RELATIVE_URL + "wizard.html", WizardRequestHandler),
        (RELATIVE_URL + r"static/(.*)", StaticFileHandler, {'path': static_path}),
    ], **settings)

    if debug:
        _STATIC_FILE_HASHES.clear()

    return app
def start_web_server(args):
    global CONFIG_DIR
    global PASSWORD_DIGEST
    global USING_PASSWORD
    global ON_HASSIO
    global USING_HASSIO_AUTH
    global COOKIE_SECRET

    CONFIG_DIR = args.configuration
    mkdir_p(CONFIG_DIR)
    mkdir_p(os.path.join(CONFIG_DIR, ".esphome"))

    ON_HASSIO = args.hassio
    if ON_HASSIO:
        USING_HASSIO_AUTH = not get_bool_env('DISABLE_HA_AUTHENTICATION')
        USING_PASSWORD = False
    else:
        USING_HASSIO_AUTH = False
        USING_PASSWORD = args.password

    if USING_PASSWORD:
        if IS_PY2:
            PASSWORD_DIGEST = hmac.new(args.password).digest()
        else:
            PASSWORD_DIGEST = hmac.new(args.password.encode()).digest()

    if USING_HASSIO_AUTH or USING_PASSWORD:
        path = esphome_storage_path(CONFIG_DIR)
        storage = EsphomeStorageJSON.load(path)
        if storage is None:
            storage = EsphomeStorageJSON.get_default()
            storage.save(path)
        COOKIE_SECRET = storage.cookie_secret

    app = make_app(args.verbose)
    if args.socket is not None:
        _LOGGER.info("Starting dashboard web server on unix socket %s and configuration dir %s...",
                     args.socket, CONFIG_DIR)
        server = tornado.httpserver.HTTPServer(app)
        socket = tornado.netutil.bind_unix_socket(args.socket, mode=0o666)
        server.add_socket(socket)
    else:
        _LOGGER.info("Starting dashboard web server on port %s and configuration dir %s...",
                     args.port, CONFIG_DIR)
        app.listen(args.port)

    if args.open_ui:
        import webbrowser

        webbrowser.open('localhost:{}'.format(args.port))

    if STATUS_USE_PING:
        status_thread = PingStatusThread()
    else:
        status_thread = MDNSStatusThread()
    status_thread.start()
    try:
        tornado.ioloop.IOLoop.current().start()
    except KeyboardInterrupt:
        _LOGGER.info("Shutting down...")
        STOP_EVENT.set()
        PING_REQUEST.set()
        status_thread.join()
        if args.socket is not None:
            os.remove(args.socket)
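start_web_server only reads a handful of attributes from args, so the argparse namespace the CLI builds can be mimicked directly; a purely illustrative launch (attribute values are made up):

```python
import types

# Illustrative only: mimics the argparse namespace the CLI would build.
args = types.SimpleNamespace(
    configuration='config',  # directory holding the YAML files
    hassio=False,
    password='',             # falsy -> no password auth
    verbose=False,
    socket=None,             # None -> listen on a TCP port instead
    port=6052,
    open_ui=False,
)
start_web_server(args)  # blocks in the tornado IOLoop until Ctrl+C
```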
@@ -1,244 +0,0 @@
nav .brand-logo {
  margin-left: 48px;
  font-size: 20px;
}

main .container {
  margin-top: -12vh;
  flex-shrink: 0;
}

.ribbon {
  width: 100%;
  height: 17vh;
  background-color: #3F51B5;
  flex-shrink: 0;
}

.ribbon-fab:not(.tap-target-origin) {
  position: absolute;
  right: 24px;
  top: calc(17vh + 34px);
}

i.very-large {
  font-size: 8rem;
  padding-top: 2px;
  color: #424242;
}

.card .card-content {
  padding-left: 18px;
  padding-bottom: 10px;
}

.card-action a, .card-dropdown-action a {
  cursor: pointer;
}

.inlinecode {
  box-sizing: border-box;
  padding: 0.2em 0.4em;
  margin: 0;
  font-size: 85%;
  background-color: rgba(27,31,35,0.05);
  border-radius: 3px;
  font-family: "SFMono-Regular",Consolas,"Liberation Mono",Menlo,Courier,monospace;
}

.log {
  max-height: calc(100% - 56px);
  background-color: #1c1c1c;
  margin-top: 0;
  margin-bottom: 0;
  font-family: "SFMono-Regular", Consolas, "Liberation Mono", Menlo, Courier, monospace;
  font-size: 12px;
  padding: 16px;
  overflow: auto;
  line-height: 1.45;
  border-radius: 3px;
  white-space: pre-wrap;
  overflow-wrap: break-word;
  color: #DDD;
}

.log-bold { font-weight: bold; }
.log-italic { font-style: italic; }
.log-underline { text-decoration: underline; }
.log-strikethrough { text-decoration: line-through; }
.log-underline.log-strikethrough { text-decoration: underline line-through; }
.log-secret {
  -webkit-user-select: none;
  -moz-user-select: none;
  -ms-user-select: none;
  user-select: none;
}
.log-secret-redacted {
  opacity: 0;
  width: 1px;
  font-size: 1px;
}
.log-fg-black { color: rgb(128,128,128); }
.log-fg-red { color: rgb(255,0,0); }
.log-fg-green { color: rgb(0,255,0); }
.log-fg-yellow { color: rgb(255,255,0); }
.log-fg-blue { color: rgb(0,0,255); }
.log-fg-magenta { color: rgb(255,0,255); }
.log-fg-cyan { color: rgb(0,255,255); }
.log-fg-white { color: rgb(187,187,187); }
.log-bg-black { background-color: rgb(0,0,0); }
.log-bg-red { background-color: rgb(255,0,0); }
.log-bg-green { background-color: rgb(0,255,0); }
.log-bg-yellow { background-color: rgb(255,255,0); }
.log-bg-blue { background-color: rgb(0,0,255); }
.log-bg-magenta { background-color: rgb(255,0,255); }
.log-bg-cyan { background-color: rgb(0,255,255); }
.log-bg-white { background-color: rgb(255,255,255); }

.modal {
  width: 95%;
  max-height: 90%;
  height: 85% !important;
}

.page-footer {
  padding-top: 0;
}

body {
  display: flex;
  min-height: 100vh;
  flex-direction: column;
}

main {
  flex: 1 0 auto;
}

ul.browser-default {
  padding-left: 30px;
  margin-top: 10px;
  margin-bottom: 15px;
}

ul.browser-default li {
  list-style-type: initial;
}

ul.stepper:not(.horizontal) .step.active::before, ul.stepper:not(.horizontal) .step.done::before, ul.stepper.horizontal .step.active .step-title::before, ul.stepper.horizontal .step.done .step-title::before {
  background-color: #3f51b5 !important;
}

.select-port-container {
  margin-top: 8px;
  margin-right: 24px;
  width: 350px;
}

.select-port-container .select-dropdown {
  color: #fff;
}

.select-port-container .caret {
  fill: #fff;
}

.dropdown-trigger {
  cursor: pointer;
}

.select-action {
  width: auto !important;
  height: auto !important;
  white-space: nowrap;
}

.tap-target-wrapper {
  position: fixed !important;
}

/* https://github.com/tnhu/status-indicator/blob/master/styles.css */
.status-indicator .status-indicator-icon {
  display: inline-block;
  border-radius: 50%;
  width: 10px;
  height: 10px;
}

.status-indicator.unknown .status-indicator-icon {
  background-color: rgb(216, 226, 233);
}

.status-indicator.unknown .status-indicator-text::after {
  content: "Unknown status";
}

.status-indicator.offline .status-indicator-icon {
  background-color: rgb(255, 77, 77);
}

.status-indicator.offline .status-indicator-text::after {
  content: "Offline";
}

.status-indicator.not-responding .status-indicator-icon {
  background-color: rgb(255, 170, 0);
}

.status-indicator.not-responding .status-indicator-text::after {
  content: "Not Responding";
}

@keyframes status-indicator-pulse-online {
  0% {
    box-shadow: 0 0 0 0 rgba(75, 210, 143, .5);
  }
  25% {
    box-shadow: 0 0 0 10px rgba(75, 210, 143, 0);
  }
  30% {
    box-shadow: 0 0 0 0 rgba(75, 210, 143, 0);
  }
}

.status-indicator.online .status-indicator-icon {
  background-color: rgb(75, 210, 143);
  animation-duration: 5s;
  animation-timing-function: ease-in-out;
  animation-iteration-count: infinite;
  animation-direction: normal;
  animation-delay: 0s;
  animation-fill-mode: none;
  animation-name: status-indicator-pulse-online;
}

.status-indicator.online .status-indicator-text::after {
  content: "Online";
}

#editor {
  margin-top: 0;
  margin-bottom: 0;
  border-radius: 3px;
  height: calc(100% - 56px);
}

.update-available i {
  vertical-align: bottom;
  font-size: 20px !important;
  color: #3F51B5 !important;
  margin-right: -4.5px;
  margin-left: -5.5px;
}

.flash-using-esphomeflasher {
  vertical-align: middle;
  color: #666 !important;
}

.error {
  background: #e53935;
  color: #fff;
  padding: 10px 15px;
  margin-top: 15px;
}
@@ -1,717 +0,0 @@
// Disclaimer: This file was written in a hurry and by someone
// who does not know JS at all. This file desperately needs cleanup.
document.addEventListener('DOMContentLoaded', () => {
  M.AutoInit(document.body);
});

const initializeColorState = () => {
  return {
    bold: false,
    italic: false,
    underline: false,
    strikethrough: false,
    // Colors use null for "unset": addSpan() and the SGR reset code
    // both compare and assign against null.
    foregroundColor: null,
    backgroundColor: null,
    carriageReturn: false,
    secret: false,
  };
};
const colorReplace = (pre, state, text) => {
  const re = /(?:\033|\\033)(?:\[(.*?)[@-~]|\].*?(?:\007|\033\\))/g;
  let i = 0;

  if (state.carriageReturn) {
    if (text !== "\n") {
      // don't remove if \r\n
      pre.removeChild(pre.lastChild);
    }
    state.carriageReturn = false;
  }

  if (text.includes("\r")) {
    state.carriageReturn = true;
  }

  const lineSpan = document.createElement("span");
  lineSpan.classList.add("line");
  pre.appendChild(lineSpan);

  const addSpan = (content) => {
    if (content === "")
      return;

    const span = document.createElement("span");
    if (state.bold) span.classList.add("log-bold");
    if (state.italic) span.classList.add("log-italic");
    if (state.underline) span.classList.add("log-underline");
    if (state.strikethrough) span.classList.add("log-strikethrough");
    if (state.secret) span.classList.add("log-secret");
    if (state.foregroundColor !== null) span.classList.add(`log-fg-${state.foregroundColor}`);
    if (state.backgroundColor !== null) span.classList.add(`log-bg-${state.backgroundColor}`);
    span.appendChild(document.createTextNode(content));
    lineSpan.appendChild(span);

    if (state.secret) {
      const redacted = document.createElement("span");
      redacted.classList.add("log-secret-redacted");
      redacted.appendChild(document.createTextNode("[redacted]"));
      lineSpan.appendChild(redacted);
    }
  };

  while (true) {
    const match = re.exec(text);
    if (match === null)
      break;

    const j = match.index;
    addSpan(text.substring(i, j));
    i = j + match[0].length;

    if (match[1] === undefined) continue;

    for (const colorCode of match[1].split(";")) {
      switch (parseInt(colorCode)) {
        case 0:
          // reset
          state.bold = false;
          state.italic = false;
          state.underline = false;
          state.strikethrough = false;
          state.foregroundColor = null;
          state.backgroundColor = null;
          state.secret = false;
          break;
        case 1:
          state.bold = true;
          break;
        case 3:
          state.italic = true;
          break;
        case 4:
          state.underline = true;
          break;
        case 5:
          state.secret = true;
          break;
        case 6:
          state.secret = false;
          break;
        case 9:
          state.strikethrough = true;
          break;
        case 22:
          state.bold = false;
          break;
        case 23:
          state.italic = false;
          break;
        case 24:
          state.underline = false;
          break;
        case 29:
          state.strikethrough = false;
          break;
        case 30:
          state.foregroundColor = "black";
          break;
        case 31:
          state.foregroundColor = "red";
          break;
        case 32:
          state.foregroundColor = "green";
          break;
        case 33:
          state.foregroundColor = "yellow";
          break;
        case 34:
          state.foregroundColor = "blue";
          break;
        case 35:
          state.foregroundColor = "magenta";
          break;
        case 36:
          state.foregroundColor = "cyan";
          break;
        case 37:
          state.foregroundColor = "white";
          break;
        case 39:
          state.foregroundColor = null;
          break;
        case 41:
          state.backgroundColor = "red";
          break;
        case 42:
          state.backgroundColor = "green";
          break;
        case 43:
          state.backgroundColor = "yellow";
          break;
        case 44:
          state.backgroundColor = "blue";
          break;
        case 45:
          state.backgroundColor = "magenta";
          break;
        case 46:
          state.backgroundColor = "cyan";
          break;
        case 47:
          state.backgroundColor = "white";
          break;
        case 40:
        case 49:
          state.backgroundColor = null;
          break;
      }
    }
  }
  addSpan(text.substring(i));
  scrollToBottomOfElement(pre);
};
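colorReplace is a small state machine over ANSI SGR escape codes: split each `\033[...m` sequence on `;`, update the style state per code, and emit a styled span for the text in between. The same idea, sketched in Python and simplified to foreground colors and bold only:

```python
import re

ANSI_RE = re.compile(r'\033\[(.*?)m')
FG = {30: 'black', 31: 'red', 32: 'green', 33: 'yellow',
      34: 'blue', 35: 'magenta', 36: 'cyan', 37: 'white'}

def parse_sgr(text):
    """Yield (style, chunk) pairs, style being a dict like the JS state."""
    state = {'bold': False, 'fg': None}
    pos = 0
    for m in ANSI_RE.finditer(text):
        if m.start() > pos:
            yield dict(state), text[pos:m.start()]
        for code in (int(c) for c in m.group(1).split(';') if c):
            if code == 0:                     # reset
                state = {'bold': False, 'fg': None}
            elif code == 1:                   # bold on
                state['bold'] = True
            elif code in FG:                  # set foreground color
                state['fg'] = FG[code]
        pos = m.end()
    if pos < len(text):
        yield dict(state), text[pos:]

# list(parse_sgr('\033[1;31mERROR\033[0m ok'))
# -> [({'bold': True, 'fg': 'red'}, 'ERROR'),
#     ({'bold': False, 'fg': None}, ' ok')]
```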
const removeUpdateAvailable = (filename) => {
  const p = document.querySelector(`.update-available[data-node="${filename}"]`);
  // querySelector returns null (not undefined) when nothing matches.
  if (p === null)
    return;
  p.remove();
};
let configuration = "";
let wsProtocol = "ws:";
if (window.location.protocol === "https:") {
  wsProtocol = 'wss:';
}
const wsUrl = `${wsProtocol}//${window.location.hostname}:${window.location.port}${relative_url}`;
let isFetchingPing = false;
const fetchPing = () => {
  if (isFetchingPing)
    return;
  isFetchingPing = true;

  fetch(`${relative_url}ping`, {credentials: "same-origin"}).then(res => res.json())
    .then(response => {
      for (let filename in response) {
        let node = document.querySelector(`.status-indicator[data-node="${filename}"]`);
        if (node === null)
          continue;

        let status = response[filename];
        let klass;
        if (status === null) {
          klass = 'unknown';
        } else if (status === true) {
          klass = 'online';
          node.setAttribute('data-last-connected', Date.now().toString());
        } else if (node.hasAttribute('data-last-connected')) {
          const attr = parseInt(node.getAttribute('data-last-connected'));
          if (Date.now() - attr <= 5000) {
            klass = 'not-responding';
          } else {
            klass = 'offline';
          }
        } else {
          klass = 'offline';
        }

        if (node.classList.contains(klass))
          continue;

        node.classList.remove('unknown', 'online', 'offline', 'not-responding');
        node.classList.add(klass);
      }

      isFetchingPing = false;
    });
};
setInterval(fetchPing, 2000);
fetchPing();
const portSelect = document.querySelector('.nav-wrapper select');
let ports = [];

const fetchSerialPorts = (begin=false) => {
  fetch(`${relative_url}serial-ports`, {credentials: "same-origin"}).then(res => res.json())
    .then(response => {
      if (ports.length === response.length) {
        let allEqual = true;
        for (let i = 0; i < response.length; i++) {
          if (ports[i].port !== response[i].port) {
            allEqual = false;
            break;
          }
        }
        if (allEqual)
          return;
      }
      const hasNewPort = response.length >= ports.length;

      ports = response;

      const inst = M.FormSelect.getInstance(portSelect);
      if (inst !== undefined) {
        inst.destroy();
      }

      portSelect.innerHTML = "";
      const prevSelected = getUploadPort();
      for (let i = 0; i < response.length; i++) {
        const val = response[i];
        if (val.port === prevSelected) {
          portSelect.innerHTML += `<option value="${val.port}" selected>${val.port} (${val.desc})</option>`;
        } else {
          portSelect.innerHTML += `<option value="${val.port}">${val.port} (${val.desc})</option>`;
        }
      }

      M.FormSelect.init(portSelect, {});
      if (!begin && hasNewPort)
        M.toast({html: "Discovered new serial port."});
    });
};

const getUploadPort = () => {
  const inst = M.FormSelect.getInstance(portSelect);
  if (inst === undefined) {
    return "OTA";
  }

  inst._setSelectedStates();
  return inst.getSelectedValues()[0];
};
setInterval(fetchSerialPorts, 5000);
fetchSerialPorts(true);
const logsModalElem = document.getElementById("modal-logs");

document.querySelectorAll(".action-show-logs").forEach((showLogs) => {
  showLogs.addEventListener('click', (e) => {
    configuration = e.target.getAttribute('data-node');
    const modalInstance = M.Modal.getInstance(logsModalElem);
    const log = logsModalElem.querySelector(".log");
    log.innerHTML = "";
    const colorState = initializeColorState();
    const stopLogsButton = logsModalElem.querySelector(".stop-logs");
    let stopped = false;
    stopLogsButton.innerHTML = "Stop";
    modalInstance.open();

    const filenameField = logsModalElem.querySelector('.filename');
    filenameField.innerHTML = configuration;

    const logSocket = new WebSocket(wsUrl + "logs");
    logSocket.addEventListener('message', (event) => {
      const data = JSON.parse(event.data);
      if (data.event === "line") {
        colorReplace(log, colorState, data.data);
      } else if (data.event === "exit") {
        if (data.code === 0) {
          M.toast({html: "Program exited successfully."});
        } else {
          M.toast({html: `Program failed with code ${data.code}`});
        }

        stopLogsButton.innerHTML = "Close";
        stopped = true;
      }
    });
    logSocket.addEventListener('open', () => {
      const msg = JSON.stringify({configuration: configuration, port: getUploadPort()});
      logSocket.send(msg);
    });
    logSocket.addEventListener('close', () => {
      if (!stopped) {
        M.toast({html: 'Terminated process.'});
      }
    });
    modalInstance.options.onCloseStart = () => {
      logSocket.close();
    };
  });
});
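All of these modals speak the same tiny protocol: open a WebSocket against one of the handlers registered in make_app, send a single JSON message naming the configuration (plus a target port for logs/run), then consume {"event": "line"} frames until a final {"event": "exit", "code": N}. A hypothetical out-of-browser client, assuming the third-party websocket-client package and an unauthenticated dashboard (URL and filename are illustrative):

```python
import json

import websocket  # third-party: pip install websocket-client

# Illustrative endpoint; mirror the page's wsUrl for your deployment.
ws = websocket.create_connection('ws://localhost:6052/logs')
ws.send(json.dumps({'configuration': 'livingroom.yaml', 'port': 'OTA'}))
while True:
    data = json.loads(ws.recv())
    if data['event'] == 'line':
        print(data['data'], end='')   # one colored log line
    elif data['event'] == 'exit':
        print('exited with code', data['code'])
        break
ws.close()
```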
const uploadModalElem = document.getElementById("modal-upload");

document.querySelectorAll(".action-upload").forEach((upload) => {
  upload.addEventListener('click', (e) => {
    configuration = e.target.getAttribute('data-node');
    const modalInstance = M.Modal.getInstance(uploadModalElem);
    modalInstance.options['dismissible'] = false;
    const log = uploadModalElem.querySelector(".log");
    log.innerHTML = "";
    const colorState = initializeColorState();
    const stopLogsButton = uploadModalElem.querySelector(".stop-logs");
    let stopped = false;
    stopLogsButton.innerHTML = "Stop";
    modalInstance.open();

    const filenameField = uploadModalElem.querySelector('.filename');
    filenameField.innerHTML = configuration;

    const logSocket = new WebSocket(wsUrl + "run");
    logSocket.addEventListener('message', (event) => {
      const data = JSON.parse(event.data);
      if (data.event === "line") {
        colorReplace(log, colorState, data.data);
      } else if (data.event === "exit") {
        if (data.code === 0) {
          M.toast({html: "Program exited successfully."});
          removeUpdateAvailable(configuration);
        } else {
          M.toast({html: `Program failed with code ${data.code}`});
        }

        stopLogsButton.innerHTML = "Close";
        stopped = true;
      }
    });
    logSocket.addEventListener('open', () => {
      const msg = JSON.stringify({configuration: configuration, port: getUploadPort()});
      logSocket.send(msg);
    });
    logSocket.addEventListener('close', () => {
      if (!stopped) {
        M.toast({html: 'Terminated process.'});
      }
    });
    modalInstance.options.onCloseStart = () => {
      logSocket.close();
    };
  });
});
const validateModalElem = document.getElementById("modal-validate");

document.querySelectorAll(".action-validate").forEach((upload) => {
  upload.addEventListener('click', (e) => {
    configuration = e.target.getAttribute('data-node');
    const modalInstance = M.Modal.getInstance(validateModalElem);
    const log = validateModalElem.querySelector(".log");
    log.innerHTML = "";
    const colorState = initializeColorState();
    const stopLogsButton = validateModalElem.querySelector(".stop-logs");
    let stopped = false;
    stopLogsButton.innerHTML = "Stop";
    modalInstance.open();

    const filenameField = validateModalElem.querySelector('.filename');
    filenameField.innerHTML = configuration;

    const logSocket = new WebSocket(wsUrl + "validate");
    logSocket.addEventListener('message', (event) => {
      const data = JSON.parse(event.data);
      if (data.event === "line") {
        colorReplace(log, colorState, data.data);
      } else if (data.event === "exit") {
        if (data.code === 0) {
          M.toast({
            html: `<code class="inlinecode">${configuration}</code> is valid 👍`,
            displayLength: 5000,
          });
        } else {
          M.toast({
            html: `<code class="inlinecode">${configuration}</code> is invalid 😕`,
            displayLength: 5000,
          });
        }

        stopLogsButton.innerHTML = "Close";
        stopped = true;
      }
    });
    logSocket.addEventListener('open', () => {
      const msg = JSON.stringify({configuration: configuration});
      logSocket.send(msg);
    });
    logSocket.addEventListener('close', () => {
      if (!stopped) {
        M.toast({html: 'Terminated process.'});
      }
    });
    modalInstance.options.onCloseStart = () => {
      logSocket.close();
    };
  });
});
const compileModalElem = document.getElementById("modal-compile");
const downloadButton = compileModalElem.querySelector('.download-binary');

document.querySelectorAll(".action-compile").forEach((upload) => {
  upload.addEventListener('click', (e) => {
    configuration = e.target.getAttribute('data-node');
    const modalInstance = M.Modal.getInstance(compileModalElem);
    modalInstance.options['dismissible'] = false;
    const log = compileModalElem.querySelector(".log");
    log.innerHTML = "";
    const colorState = initializeColorState();
    const stopLogsButton = compileModalElem.querySelector(".stop-logs");
    let stopped = false;
    stopLogsButton.innerHTML = "Stop";
    downloadButton.classList.add('disabled');

    modalInstance.open();

    const filenameField = compileModalElem.querySelector('.filename');
    filenameField.innerHTML = configuration;

    const logSocket = new WebSocket(wsUrl + "compile");
    logSocket.addEventListener('message', (event) => {
      const data = JSON.parse(event.data);
      if (data.event === "line") {
        colorReplace(log, colorState, data.data);
      } else if (data.event === "exit") {
        if (data.code === 0) {
          M.toast({html: "Program exited successfully."});
          downloadButton.classList.remove('disabled');
        } else {
          M.toast({html: `Program failed with code ${data.code}`});
        }

        stopLogsButton.innerHTML = "Close";
        stopped = true;
      }
    });
    logSocket.addEventListener('open', () => {
      const msg = JSON.stringify({configuration: configuration});
      logSocket.send(msg);
    });
    logSocket.addEventListener('close', () => {
      if (!stopped) {
        M.toast({html: 'Terminated process.'});
      }
    });
    modalInstance.options.onCloseStart = () => {
      logSocket.close();
    };
  });
});

downloadButton.addEventListener('click', () => {
  const link = document.createElement("a");
  // No local `name` is in scope, so this resolves to window.name
  // (usually empty); the browser then falls back to the filename
  // implied by the URL and response headers.
  link.download = name;
  link.href = `${relative_url}download.bin?configuration=${encodeURIComponent(configuration)}`;
  document.body.appendChild(link);
  link.click();
  link.remove();
});
const cleanMqttModalElem = document.getElementById("modal-clean-mqtt");

document.querySelectorAll(".action-clean-mqtt").forEach((btn) => {
  btn.addEventListener('click', (e) => {
    configuration = e.target.getAttribute('data-node');
    const modalInstance = M.Modal.getInstance(cleanMqttModalElem);
    const log = cleanMqttModalElem.querySelector(".log");
    log.innerHTML = "";
    const colorState = initializeColorState();
    const stopLogsButton = cleanMqttModalElem.querySelector(".stop-logs");
    let stopped = false;
    stopLogsButton.innerHTML = "Stop";
    modalInstance.open();

    const filenameField = cleanMqttModalElem.querySelector('.filename');
    filenameField.innerHTML = configuration;

    const logSocket = new WebSocket(wsUrl + "clean-mqtt");
    logSocket.addEventListener('message', (event) => {
      const data = JSON.parse(event.data);
      if (data.event === "line") {
        colorReplace(log, colorState, data.data);
      } else if (data.event === "exit") {
        stopLogsButton.innerHTML = "Close";
        stopped = true;
      }
    });
    logSocket.addEventListener('open', () => {
      const msg = JSON.stringify({configuration: configuration});
      logSocket.send(msg);
    });
    logSocket.addEventListener('close', () => {
      if (!stopped) {
        M.toast({html: 'Terminated process.'});
      }
    });
    modalInstance.options.onCloseStart = () => {
      logSocket.close();
    };
  });
});
const cleanModalElem = document.getElementById("modal-clean");

document.querySelectorAll(".action-clean").forEach((btn) => {
  btn.addEventListener('click', (e) => {
    configuration = e.target.getAttribute('data-node');
    const modalInstance = M.Modal.getInstance(cleanModalElem);
    const log = cleanModalElem.querySelector(".log");
    log.innerHTML = "";
    const colorState = initializeColorState();
    const stopLogsButton = cleanModalElem.querySelector(".stop-logs");
    let stopped = false;
    stopLogsButton.innerHTML = "Stop";
    modalInstance.open();

    const filenameField = cleanModalElem.querySelector('.filename');
    filenameField.innerHTML = configuration;

    const logSocket = new WebSocket(wsUrl + "clean");
    logSocket.addEventListener('message', (event) => {
      const data = JSON.parse(event.data);
      if (data.event === "line") {
        colorReplace(log, colorState, data.data);
      } else if (data.event === "exit") {
        if (data.code === 0) {
          M.toast({html: "Program exited successfully."});
        } else {
          M.toast({html: `Program failed with code ${data.code}`});
        }
        stopLogsButton.innerHTML = "Close";
        stopped = true;
      }
    });
    logSocket.addEventListener('open', () => {
      const msg = JSON.stringify({configuration: configuration});
      logSocket.send(msg);
    });
    logSocket.addEventListener('close', () => {
      if (!stopped) {
        M.toast({html: 'Terminated process.'});
      }
    });
    modalInstance.options.onCloseStart = () => {
      logSocket.close();
    };
  });
});
const hassConfigModalElem = document.getElementById("modal-hass-config");

document.querySelectorAll(".action-hass-config").forEach((btn) => {
  btn.addEventListener('click', (e) => {
    configuration = e.target.getAttribute('data-node');
    const modalInstance = M.Modal.getInstance(hassConfigModalElem);
    const log = hassConfigModalElem.querySelector(".log");
    log.innerHTML = "";
    const colorState = initializeColorState();
    const stopLogsButton = hassConfigModalElem.querySelector(".stop-logs");
    let stopped = false;
    stopLogsButton.innerHTML = "Stop";
    modalInstance.open();

    const filenameField = hassConfigModalElem.querySelector('.filename');
    filenameField.innerHTML = configuration;

    const logSocket = new WebSocket(wsUrl + "hass-config");
    logSocket.addEventListener('message', (event) => {
      const data = JSON.parse(event.data);
      if (data.event === "line") {
        colorReplace(log, colorState, data.data);
      } else if (data.event === "exit") {
        if (data.code === 0) {
          M.toast({html: "Program exited successfully."});
        } else {
          M.toast({html: `Program failed with code ${data.code}`});
        }
        stopLogsButton.innerHTML = "Close";
        stopped = true;
      }
    });
    logSocket.addEventListener('open', () => {
      const msg = JSON.stringify({configuration: configuration});
      logSocket.send(msg);
    });
    logSocket.addEventListener('close', () => {
      if (!stopped) {
        M.toast({html: 'Terminated process.'});
      }
    });
    modalInstance.options.onCloseStart = () => {
      logSocket.close();
    };
  });
});
const editModalElem = document.getElementById("modal-editor");
const editorElem = editModalElem.querySelector("#editor");
const editor = ace.edit(editorElem);
editor.setTheme("ace/theme/dreamweaver");
editor.session.setMode("ace/mode/yaml");
editor.session.setOption('useSoftTabs', true);
editor.session.setOption('tabSize', 2);

const saveButton = editModalElem.querySelector(".save-button");
const saveEditor = () => {
  fetch(`${relative_url}edit?configuration=${encodeURIComponent(configuration)}`, {
    credentials: "same-origin",
    method: "POST",
    body: editor.getValue()
  }).then(res => res.text()).then(() => {
    M.toast({
      html: `Saved <code class="inlinecode">${configuration}</code>`
    });
  });
};

editor.commands.addCommand({
  name: 'saveCommand',
  bindKey: {win: 'Ctrl-S', mac: 'Command-S'},
  exec: saveEditor,
  readOnly: false
});

saveButton.addEventListener('click', saveEditor);
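The editor round-trips a file through the EditRequestHandler shown earlier: GET returns the raw YAML, POST overwrites it with the request body. A hypothetical script doing the same with requests (base URL and filename are illustrative, and this assumes no password auth):

```python
import requests

base = 'http://localhost:6052'  # illustrative; match your dashboard URL
params = {'configuration': 'livingroom.yaml'}

# Fetch the current file contents (what the editor loads).
text = requests.get(base + '/edit', params=params).text

# Write back a modified version (what the Save button posts).
requests.post(base + '/edit', params=params, data=text.encode('utf-8'))
```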
document.querySelectorAll(".action-edit").forEach((btn) => {
  btn.addEventListener('click', (e) => {
    configuration = e.target.getAttribute('data-node');
    const modalInstance = M.Modal.getInstance(editModalElem);
    const filenameField = editModalElem.querySelector('.filename');
    filenameField.innerHTML = configuration;

    fetch(`${relative_url}edit?configuration=${encodeURIComponent(configuration)}`, {credentials: "same-origin"})
      .then(res => res.text()).then(response => {
        editor.setValue(response, -1);
      });

    modalInstance.open();
  });
});
const modalSetupElem = document.getElementById("modal-wizard");
const setupWizardStart = document.getElementById('setup-wizard-start');
const startWizard = () => {
  const modalInstance = M.Modal.getInstance(modalSetupElem);
  modalInstance.open();

  modalInstance.options.onCloseStart = () => {

  };

  $('.stepper').activateStepper({
    linearStepsNavigation: false,
    autoFocusInput: true,
    autoFormCreation: true,
    showFeedbackLoader: true,
    parallel: false
  });
};
const scrollToBottomOfElement = (element) => {
  var atBottom = false;
  if (element.scrollTop + 30 >= (element.scrollHeight - element.offsetHeight)) {
    atBottom = true;
  }

  if (atBottom) {
    element.scrollTop = element.scrollHeight;
  }
};

setupWizardStart.addEventListener('click', startWizard);