68 Commits

Author SHA1 Message Date
dependabot[bot]
356cc632a3 Bump requests from 2.32.3 to 2.32.4
Bumps [requests](https://github.com/psf/requests) from 2.32.3 to 2.32.4.
- [Release notes](https://github.com/psf/requests/releases)
- [Changelog](https://github.com/psf/requests/blob/main/HISTORY.md)
- [Commits](https://github.com/psf/requests/compare/v2.32.3...v2.32.4)

---
updated-dependencies:
- dependency-name: requests
  dependency-version: 2.32.4
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-06-10 10:04:58 +00:00
Konstantin Köhring
6c70d7c75f Fix Python packaging for Python 3.12 (#79)
* Translate setup.py to pyproject.toml

* Introduce PEP518 by using Poetry as package manager
* Introduce PEP402-ish package structure
* Remove requirements.txt and setup.py
* Update Dockerfile and Docs for new packaging method

* Bump dependencies, remove six

six is no longer necessary as of docker-py 5.0.1 (see https://github.com/docker/docker-py/pull/2863)
2024-11-13 12:17:10 -05:00
Konstantin Köhring
9ac4048783 Escape $$ in labels and env (#77) 2024-11-12 13:55:33 -05:00
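For context, the escaping this title refers to amounts to doubling every $ so that docker-compose does not treat it as variable interpolation. A minimal sketch (the label value is made up; the helper mirrors shell_escape_string in src/autocompose.py at the end of this diff):
```
def shell_escape_string(input_string):
    # Double each $ so docker-compose does not try to interpolate it.
    return input_string.replace("$", "$$")

# Hypothetical basic-auth label value as produced by htpasswd.
print(shell_escape_string("traefik.http.middlewares.auth.basicauth.users=user:$apr1$abc$xyz"))
# traefik.http.middlewares.auth.basicauth.users=user:$$apr1$$abc$$xyz
```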
Red5d
d3aa07ee74 Remove old pipx install instructions
The PyPI version is no longer updated. The Docker image is updated automatically and is the currently recommended way to use the tool.
#74
2024-04-04 22:22:46 -04:00
Bitals
2e6a55fad6 Installation instructions and AUR (#68)
I added a bit of information about installing this tool to the system and about my AUR packages.

Also, the PyPI version is outdated (1.0.1 from Jan 3, 2016, https://pypi.org/project/docker-autocompose/#history); is it still supported?
2023-11-27 21:08:31 -05:00
Pedro Reyes
3ad0d3621e Use docker compose compatible boolean values (#66) 2023-07-29 21:36:36 -04:00
Red5d
e1c0e23ff2 Sort mountpoints 2023-07-28 23:58:04 -04:00
Chad H
b07928353c Filter containers via regex (#57)
Allows for dumping a subset of all containers without having to list them all explicitly.

`autocompose.py --all --filter "foo-[0-9]*"`
2023-01-11 19:28:05 -05:00
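A small, self-contained sketch of that filtering step; the container names are made up, and the re.compile/search usage mirrors the --filter handling in src/autocompose.py at the end of this diff:
```
import re

# Hypothetical container names; in the script they come from the Docker API.
container_names = ["foo-1", "foo-2", "bar-db"]

cfilter = re.compile(r"foo-[0-9]*")
# Keep only the names the pattern matches somewhere, as the script does.
container_names = [c for c in container_names if cfilter.search(c)]
print(container_names)  # ['foo-1', 'foo-2']
```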
Konstantin Köhring
d07ce54dd0 Quote all string values to reduce probability of compose syntax errors (#51)
* Fix a bug where strings with special characters are not quoted

Before:
```
...
services:
  <service>:
    ....
    logging:
      options:
        tag: {{.ImageName}}|{{.Name}}|{{.ImageFullID}}|{{.FullID}}
        max-file: 3
    ...
```
and docker-compose up fails
after:
```
...
services:
  <service>:
    ....
    logging:
      options:
        tag: "{{.ImageName}}|{{.Name}}|{{.ImageFullID}}|{{.FullID}}"
        max-file: "3"
    ...
```
and docker-compose up works

* Remove no longer necessary workarounds
2022-11-14 01:07:13 -05:00
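As an illustration of the general idea, strings can be forced into double quotes with a custom representer; the script already registers one for booleans via pyaml.add_representer (see src/autocompose.py at the end of this diff). Whether this commit used exactly the same hook for strings is an assumption here:
```
import pyaml

# Emit every string scalar with double quotes so values such as
# "{{.ImageName}}|{{.Name}}" or "3" stay strings when the file is re-parsed.
pyaml.add_representer(
    str,
    lambda dumper, value: dumper.represent_scalar('tag:yaml.org,2002:str', value, style='"'),
)

pyaml.p({"logging": {"options": {"tag": "{{.ImageName}}|{{.Name}}", "max-file": "3"}}})
```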
Juan Biondi
0aa4522143 Fix index-out-of-range error (#54)
* Gather info about all networks in the system.

* Only generate host networks if all info is required.

* Fix issue getting network information.

Co-authored-by: Juan Biondi <juan@payslip.com>
2022-11-07 15:00:39 -05:00
Juan Biondi
dea3d848e8 Gather info about all networks in the system (#53)
* Gather info about all networks in the system.

* Only generate host networks if all info is required.

Co-authored-by: Juan Biondi <juan@payslip.com>
2022-11-02 16:39:39 -04:00
Red5d
85e398193f Update README with note on how to validate output. 2022-08-20 14:46:27 -04:00
Red5d
dfc73d0bbd Update compose version 3 to 3.6.
With the somewhat-recent Docker features and spec changes now included in this script's compose file generation, specifying compose file version 3.6 (at least) is necessary for the output compose file to pass validation with "docker-compose config".
2022-08-20 14:39:29 -04:00
Red5d
1768a65fce Fixes the --all option, volumes of type 'bind' and read only option (#46)
(from @d-EScape)

One PR that includes my suggestions for #17 and some new ones:

The --all option would not work because every iteration over container_names could set 'networks' and 'volumes' to None, even if a previous container had a network. Later iterations could then not add a network, because the accumulator was no longer a dict, resulting in an exception (a sketch of the fixed accumulation pattern follows this entry).

The code might need some cleaning up. I left some comments and old pieces (commented out) to explain to @Red5d what I did and why. Since I am new to this script and the docker-compose format, I might have overlooked something. Please check.

Co-authored-by: d-EScape <8693608+d-EScape@users.noreply.github.com>
2022-08-20 14:36:47 -04:00
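A self-contained sketch of the accumulation pattern described above. The generate() stub and container names here are illustrative, but the None guards mirror the loop that now appears in src/autocompose.py at the end of this diff:
```
# Illustrative stand-in for the script's generate() helper: a container with
# no networks or volumes returns None for those parts.
def generate(cname):
    if cname == "plain":
        return ({cname: {"image": "busybox"}}, None, None)
    return ({cname: {"image": "nginx"}}, {"web": {}}, {"data": {}})

struct, networks, volumes = {}, {}, {}
for cname in ["web-1", "plain", "web-2"]:
    cfile, c_networks, c_volumes = generate(cname)
    struct.update(cfile)
    # Merge only when something was returned; overwriting the accumulators
    # with None is what broke --all for later containers.
    if c_networks is not None:
        networks.update(c_networks)
    if c_volumes is not None:
        volumes.update(c_volumes)

print(networks)  # {'web': {}}
print(volumes)   # {'data': {}}
```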
acdoussan
357fef9782 better handling for default networks (#42)
* better handling for default networks

* fix for issues/43

* check for None explicitly rather than allowing truthy/falsy conversion

* update to read volume data from mounts, rather than host config binds

Co-authored-by: Adam Doussan <acdoussan@Adams-MacBook-Pro.local>
2022-08-14 17:54:43 -04:00
acdoussan
0c4ff4fb25 Fix volumes missing in generated compose file (#41)
* fix volume export based off of https://github.com/Red5d/docker-autocompose/issues/17#issuecomment-943041549

* remove unneeded space

* export volumes in addition to networks

* fix syntax error

* actually fix syntax errors

Co-authored-by: Adam Doussan <acdoussan@Adams-MacBook-Pro.local>
2022-08-13 16:31:05 -04:00
Red5d
e19c4654af Merge pull request #38 from ostafen/master
Add -a/--all flag to list all containers
2022-04-09 22:12:19 -04:00
Stefano
d6dddedb3d Add -a/--all flag to list all containers 2022-04-09 09:30:14 +02:00
Red5d
0dfdac353f Merge pull request #36 from ostafen/master
Format properly date/datetime labels
2022-03-16 09:24:12 -04:00
Red5d
a8f00e0deb Merge pull request #37 from alexanderpetrenz/master
Adding Name Attribute to each Network
2022-03-16 08:45:53 -04:00
alexanderpetrenz
adf98bb062 Merge branch 'Red5d:master' into master 2022-03-11 09:26:21 +01:00
Alexander Petrenz
1af6b49233 added name attribute to every retrieved network 2022-03-11 08:59:55 +01:00
Alexander Petrenz
40aaf8e82c fixed network retrieval 2022-03-11 08:59:47 +01:00
Stefano
63810906f9 Format properly date/datetime labels 2022-03-10 20:45:35 +01:00
Red5d
e32c9d4275 Merge pull request #35 from ostafen/master
Fix ERROR: network must be a mapping, not an array.
2022-03-10 14:04:38 -05:00
Stefano
caa747b605 Fix ERROR: network must be a mapping, not an array. 2022-03-10 15:32:50 +01:00
Red5d
d783902265 Merge pull request #34 from alexanderpetrenz/master
command property: replace string concatenation by taking over given list
2022-03-09 08:10:19 -05:00
Alexander Petrenz
e6badd31c3 now collecting networks not present in every container 2022-03-08 15:49:49 +01:00
Alexander Petrenz
3f756235b2 command property: replace string concatenation by taking over given list 2022-03-08 14:33:33 +01:00
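A rough before/after sketch of what that change amounts to; the cinspect dict is a hypothetical slice of docker inspect output, and the join-based line matches the removed autocompose.py shown later in this diff:
```
cinspect = {"Config": {"Cmd": ["redis-server", "--appendonly", "yes"]}}

# Old approach: flatten the command into a single string.
command_as_string = " ".join(cinspect["Config"]["Cmd"])

# New approach: take over the given list unchanged, so arguments keep their
# boundaries when docker-compose parses the value.
command_as_list = cinspect["Config"]["Cmd"]

print(command_as_string)  # redis-server --appendonly yes
print(command_as_list)    # ['redis-server', '--appendonly', 'yes']
```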
Red5d
b9c096dd94 Merge pull request #33 from moschlar/patch-1
Update autocompose.py
2022-03-07 14:50:04 -05:00
Moritz Schlarb
e7dbe41f23 Update autocompose.py
Print the error message to stderr so that it will be seen when redirecting stdout to `docker-compose.yml`
2022-03-07 11:56:06 +01:00
Red5d
d976d520b4 Merge pull request #32 from hgghyxo/patch-1
Update README.md
2022-02-24 11:14:25 -05:00
hgghyxo
ec211717ed Update README.md
Add a one-liner to print out all containers
2022-02-24 11:49:46 +01:00
Red5d
4e8ff192dc Updates about ARM support, dependency listing, and formatting/wording 2021-08-07 14:06:52 -04:00
Red5d
8ada367b9e Enable generating compose files for containers that aren't running. 2021-08-07 13:51:02 -04:00
Red5d
94ca597f3d Merge pull request #23 from LunaticMuch/fix/six-module
fix: six module missing
Not sure why this wasn't necessary before, but it definitely needs it now.
2021-08-07 13:40:14 -04:00
Red5d
0ed2f306cc Workaround for buildx not liking the capital letter in my username 2021-08-07 13:27:21 -04:00
Red5d
6415678751 Merge pull request #18 from doob187/master
I've added the secret key to the repo. Sorry this has taken a while. Life's been really busy and I forgot about this one.
2021-08-07 13:06:49 -04:00
doob187
4b5e27cf29 Merge branch 'Red5d:master' into master 2021-08-07 08:31:16 +02:00
LunaticMuch
d90e2d5389 fix: six module missing 2021-08-02 09:32:23 +01:00
Red5d
881b7979d5 Fix malformed 'devices' values 2021-07-28 23:43:07 -04:00
Red5d
a1f2aabdee Fix lines that depend on the 'networks' key being in values list 2021-07-28 23:19:19 -04:00
Red5d
d8e5aacf20 Exclude the 'networks' key if only the default bridge network is present 2021-07-28 23:13:17 -04:00
Red5d
0556a35376 Remove fields that are invalid in the compose v3 spec 2021-07-28 23:03:54 -04:00
doob187
046f0e9da7 Update Dockerfile 2021-05-29 08:11:47 +02:00
doob187
9c3e01d167 Create docker.yml 2021-05-29 08:04:46 +02:00
Red5d
9f3960defd Merge pull request #14 from akshaysalunke13/master
Fix yaml output to reflect latest docker-compose reference.
2021-03-15 10:28:46 -04:00
Akshay
98704a81b6 Fix 'networks' in yaml
fix opts following official compose reference

[docker-compose reference](https://docs.docker.com/compose/compose-file/compose-file-v3/)
2021-03-14 21:05:41 +11:00
Akshay Salunke
a3ff6534ab ignore ds_store file for mac 2021-03-14 17:50:13 +11:00
akshay-scalegrowth
555f1f90b3 Added requirements.txt for required packages 2021-03-14 16:29:26 +11:00
akshay-scalegrowth
09f14bceca Added gitignore 2021-03-14 16:29:06 +11:00
Red5d
e0f6c83bc4 Merge pull request #13 from rootrider/master
Update README.md
2020-10-22 19:20:42 -04:00
Joel
7fe7a07fb5 Update README.md
correct the link to the newer docker python module
2020-10-22 14:54:00 -07:00
Red5d
4eb8d97536 Fixed typo in compose file version setting example. 2018-12-20 21:04:40 -05:00
Red5d
b78da97768 Add additional networks stanza at the bottom of the output to support the networks attached to the container(s). 2018-08-24 19:44:24 -04:00
Red5d
e5ac520ff6 Support networks and network aliases. 2018-08-24 17:49:55 -04:00
Red5d
9827c4488d Updated README with new feature and usage information. 2018-07-27 20:44:14 -04:00
Red5d
800b088cea Update setup script to use new docker module, bump minimum module versions, and overall app version. 2018-07-27 20:21:35 -04:00
Red5d
9e0fa327ee Update script to use new module version, output compose file version 3, and be able to generate a combined file for multiple containers. 2018-07-27 20:20:41 -04:00
Red5d
f22f154fe9 Merge pull request #6 from smokes2345/patch-1
Update autocompose.py
2018-07-15 12:22:01 -04:00
Matt Davidson
bdb38eecac Update autocompose.py
Trying to iterate over cinspect['HostConfig']['PortBindings'] when it is None produces a TypeError
2018-07-15 11:13:38 -04:00
Red5d
f3f2eca906 Adjust format of example for clarity 2018-06-27 23:16:48 -04:00
Red5d
05bab6fda0 Merge pull request #4 from docwhat/patch-1
Update README.md
2018-03-20 14:34:57 -04:00
Christian Höltje
2a1b25e4f4 Update README.md
fixed image name.
2018-03-20 11:50:43 -04:00
Red5d
40ec3f9b42 Update image tag name. 2018-02-12 14:34:25 -05:00
Red5d
858b187e75 Merge pull request #2 from mogensen/master
Adding Dockerfile to run docker-autocompose as a container
2017-09-12 09:15:16 -04:00
Frederik Mogensen
b71078f68b Updated Readme 2017-09-12 12:41:05 +02:00
Frederik Mogensen
83bc0afad6 Docker file for running autocompose in docker 2017-09-12 12:15:46 +02:00
10 changed files with 979 additions and 128 deletions

.github/workflows/docker.yml (new file, 77 lines)

@@ -0,0 +1,77 @@
name: Build Multi Stage Docker Image
on:
  push:
    branches-ignore:
      - 'dependabot/**'
  schedule:
    - cron: '0 5 * * *'
jobs:
  GHRC:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v2.3.4
      - uses: FranzDiebold/github-env-vars-action@v2.3.0
      - name: Print environment variables
        run: |
          echo "CI_REPOSITORY_NAME=$CI_REPOSITORY_NAME"
      - name: Prepare GHRC.io
        id: prep
        run: |
          REPO=$CI_REPOSITORY_NAME
          OWNER="$(echo "${{ github.repository_owner }}" | tr '[:upper:]' '[:lower:]')"
          DOCKER_IMAGE=${OWNER}/${REPO}
          if [ "$CI_REF_NAME" == "master" ];then VERSION=latest;fi
          if [ "$CI_REF_NAME" == "dev" ];then VERSION=mightly;fi
          if [ "$CI_REF_NAME" == "dockserver" ];then VERSION=dockserver;fi
          TAGS="${DOCKER_IMAGE}:${VERSION}"
          echo ::set-output name=tags::${TAGS}
          echo ::set-output name=title::${GITHUB_REPOSITORY}
          echo ::set-output name=version::${VERSION}
          echo ::set-output name=created::$(date -u +'%Y-%m-%dT%H:%M:%SZ')
      - name: Set up QEMU
        uses: docker/setup-qemu-action@v1
        with:
          platforms: linux/amd64,linux/armhf,linux/arm64
      - name: Set up Docker Buildx
        id: buildx
        uses: docker/setup-buildx-action@v1.3.0
      - name: Cache Docker layers
        uses: actions/cache@v2.1.6
        with:
          path: /tmp/.buildx-cache
          key: ${{ runner.os }}-buildx-${{ github.sha }}
          restore-keys: |
            ${{ runner.os }}-buildx-
      - name: Login to GitHub Container Registry
        if: github.event_name != 'pull_request'
        uses: docker/login-action@v1.9.0
        with:
          registry: ghcr.io
          username: ${{ github.repository_owner }}
          password: ${{ secrets.CR_PAT }}
      - name: Build and push GHRC.io
        id: docker_build
        uses: docker/build-push-action@v2.5.0
        with:
          builder: ${{ steps.buildx.outputs.name }}
          context: .
          file: ./Dockerfile
          platforms: linux/amd64,linux/armhf,linux/arm64
          push: ${{ github.event_name != 'pull_request' }}
          tags: ghcr.io/${{ steps.prep.outputs.tags }}
          labels: |
            org.opencontainers.image.title=${{ steps.prep.outputs.title }}
            org.opencontainers.image.version=${{ steps.prep.outputs.version }}
            org.opencontainers.image.created=${{ steps.prep.outputs.created }}
      - name: Image digest
        run: echo ${{ steps.docker_build.outputs.digest }}

.gitignore (new file, 160 lines)

@@ -0,0 +1,160 @@
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class
# C extensions
*.so
# Distribution / packaging
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
share/python-wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST
# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec
# Installer logs
pip-log.txt
pip-delete-this-directory.txt
# Unit test / coverage reports
htmlcov/
.tox/
.nox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
*.py,cover
.hypothesis/
.pytest_cache/
cover/
# Translations
*.mo
*.pot
# Django stuff:
*.log
local_settings.py
db.sqlite3
db.sqlite3-journal
# Flask stuff:
instance/
.webassets-cache
# Scrapy stuff:
.scrapy
# Sphinx documentation
docs/_build/
# PyBuilder
.pybuilder/
target/
# Jupyter Notebook
.ipynb_checkpoints
# IPython
profile_default/
ipython_config.py
# pyenv
# For a library or package, you might want to ignore these files since the code is
# intended to run in multiple environments; otherwise, check them in:
# .python-version
# pipenv
# According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control.
# However, in case of collaboration, if having platform-specific dependencies or dependencies
# having no cross-platform support, pipenv may install dependencies that don't work, or not
# install all needed dependencies.
#Pipfile.lock
# poetry
# Similar to Pipfile.lock, it is generally recommended to include poetry.lock in version control.
# This is especially recommended for binary packages to ensure reproducibility, and is more
# commonly ignored for libraries.
# https://python-poetry.org/docs/basic-usage/#commit-your-poetrylock-file-to-version-control
#poetry.lock
# pdm
# Similar to Pipfile.lock, it is generally recommended to include pdm.lock in version control.
#pdm.lock
# pdm stores project-wide configurations in .pdm.toml, but it is recommended to not include it
# in version control.
# https://pdm.fming.dev/#use-with-ide
.pdm.toml
# PEP 582; used by e.g. github.com/David-OConnor/pyflow and github.com/pdm-project/pdm
__pypackages__/
# Celery stuff
celerybeat-schedule
celerybeat.pid
# SageMath parsed files
*.sage.py
# Environments
.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/
# Spyder project settings
.spyderproject
.spyproject
# Rope project settings
.ropeproject
# mkdocs documentation
/site
# mypy
.mypy_cache/
.dmypy.json
dmypy.json
# Pyre type checker
.pyre/
# pytype static type analyzer
.pytype/
# Cython debug symbols
cython_debug/
# PyCharm
# JetBrains specific template is maintained in a separate JetBrains.gitignore that can
# be found at https://github.com/github/gitignore/blob/main/Global/JetBrains.gitignore
# and can be added to the global gitignore or merged into this file. For a more nuclear
# option (not recommended) you can uncomment the following to ignore the entire idea folder.
#.idea/

Dockerfile (new file, 11 lines)

@@ -0,0 +1,11 @@
FROM python:3.12-alpine
LABEL org.opencontainers.image.source=https://github.com/Red5d/docker-autocompose
WORKDIR /usr/src/app
ENTRYPOINT [ "poetry", "run", "autocompose" ]
RUN apk add --no-cache poetry
COPY poetry.lock pyproject.toml README.md ./
COPY src ./src
RUN poetry install

README.md (modified)

@@ -1,18 +1,67 @@
# docker-autocompose

Generates a docker-compose yaml definition from a running container.
Generates a docker-compose yaml definition from a docker container.

Required Modules:
* [pyaml](https://pypi.python.org/pypi/pyaml/)
* [docker-py](https://pypi.python.org/pypi/docker-py)
* [pyaml](https://pypi.python.org/project/pyaml/)
* [docker](https://pypi.python.org/project/docker)

For building this project [poetry](https://python-poetry.org/) is required. Install it with the package manager of your OS or if that's impossible with `pip`.

Install them:

    poetry install

Example Usage:

    sudo python autocompose.py container-name
    poetry run autocompose <container ids>

Generate a compose file for multiple containers together:

    poetry run autocompose apache-test mysql-test

The script defaults to outputting to compose file version 3, but use "-v 1" to output to version 1:

    poetry run autocompose -v 1 apache-test

Outputs a docker-compose compatible yaml structure:
Outputs a docker-compose compatible yaml structure.

[docker-compose reference](https://docs.docker.com/compose/)

[docker-compose yaml file specification](https://docs.docker.com/compose/compose-file/)

While experimenting with various docker containers from the Hub, I realized that I'd started several containers with complex options for volumes, ports, environment variables, etc. and there was no way I could remember all those commands without referencing the Hub page for each image if I needed to delete and re-create the container (for updates, or if something broke).

With this tool, I can easily generate docker-compose files for managing the containers that I've set up manually.

## Native installation

System-wide installation is discouraged. If you really need to, you can run `pip install --user --break-system-packages .` (use at your own discretion).

There are unofficial packages available in the Arch User Repository:
* [Stable](https://aur.archlinux.org/packages/docker-autocompose)
* [Development (follows the master branch)](https://aur.archlinux.org/packages/docker-autocompose-git)

**AUR packages are provided by a third party and are not tested or updated by the maintainer(s) of the docker-autocompose project.**

## Docker Usage

You can use this tool from a docker container by either cloning this repo and building the image or using the [automatically generated image on GitHub](https://github.com/Red5d/docker-autocompose/pkgs/container/docker-autocompose)

Pull the image from GitHub (supports both x86 and ARM)

    docker pull ghcr.io/red5d/docker-autocompose:latest

Use the new image to generate a docker-compose file from a running container or a list of space-separated container names or ids:

    docker run --rm -v /var/run/docker.sock:/var/run/docker.sock ghcr.io/red5d/docker-autocompose <container-name-or-id> <additional-names-or-ids>...

To print out all containers in a docker-compose format:

    docker run --rm -v /var/run/docker.sock:/var/run/docker.sock ghcr.io/red5d/docker-autocompose $(docker ps -aq)

## Contributing

When making changes, please validate the output from the script by writing it to a file (docker-compose.yml or docker-compose.yaml) and running "docker-compose config" in the same folder with it to ensure that the resulting compose file will be accepted by docker-compose.

autocompose.py (deleted)

@@ -1,105 +0,0 @@
#! /usr/bin/env python

import pyaml, argparse, sys
from docker import Client


def main():
    parser = argparse.ArgumentParser(description='Generate docker-compose yaml definition from running container.')
    parser.add_argument('cname', type=str, help='The name of the container to process.')
    args = parser.parse_args()
    generate(args)


def generate(args):
    c = Client(base_url='unix://var/run/docker.sock')

    try:
        cid = [x['Id'] for x in c.containers() if args.cname in x['Names'][0]][0]
    except IndexError:
        print("That container is not running.")
        sys.exit(1)

    cinspect = c.inspect_container(cid)

    # Build yaml dict structure
    cfile = {}
    cfile[args.cname] = {}
    ct = cfile[args.cname]

    values = {
        'cap_add': cinspect['HostConfig']['CapAdd'],
        'cap_drop': cinspect['HostConfig']['CapDrop'],
        'cgroup_parent': cinspect['HostConfig']['CgroupParent'],
        'container_name': args.cname,
        'devices': cinspect['HostConfig']['Devices'],
        'dns': cinspect['HostConfig']['Dns'],
        'dns_search': cinspect['HostConfig']['DnsSearch'],
        'environment': cinspect['Config']['Env'],
        'extra_hosts': cinspect['HostConfig']['ExtraHosts'],
        'image': cinspect['Config']['Image'],
        'labels': cinspect['Config']['Labels'],
        'links': cinspect['HostConfig']['Links'],
        'log_driver': cinspect['HostConfig']['LogConfig']['Type'],
        'log_opt': cinspect['HostConfig']['LogConfig']['Config'],
        'net': cinspect['HostConfig']['NetworkMode'],
        'security_opt': cinspect['HostConfig']['SecurityOpt'],
        'ulimits': cinspect['HostConfig']['Ulimits'],
        'volumes': cinspect['HostConfig']['Binds'],
        'volume_driver': cinspect['HostConfig']['VolumeDriver'],
        'volumes_from': cinspect['HostConfig']['VolumesFrom'],
        'cpu_shares': cinspect['HostConfig']['CpuShares'],
        'cpuset': cinspect['HostConfig']['CpusetCpus']+','+cinspect['HostConfig']['CpusetMems'],
        'entrypoint': cinspect['Config']['Entrypoint'],
        'user': cinspect['Config']['User'],
        'working_dir': cinspect['Config']['WorkingDir'],
        'domainname': cinspect['Config']['Domainname'],
        'hostname': cinspect['Config']['Hostname'],
        'ipc': cinspect['HostConfig']['IpcMode'],
        'mac_address': cinspect['NetworkSettings']['MacAddress'],
        'mem_limit': cinspect['HostConfig']['Memory'],
        'memswap_limit': cinspect['HostConfig']['MemorySwap'],
        'privileged': cinspect['HostConfig']['Privileged'],
        'restart': cinspect['HostConfig']['RestartPolicy']['Name'],
        'read_only': cinspect['HostConfig']['ReadonlyRootfs'],
        'stdin_open': cinspect['Config']['OpenStdin'],
        'tty': cinspect['Config']['Tty']
    }

    # Check for command and add it if present.
    if cinspect['Config']['Cmd'] != None:
        values['command'] = " ".join(cinspect['Config']['Cmd']),

    # Check for exposed/bound ports and add them if needed.
    try:
        expose_value = list(cinspect['Config']['ExposedPorts'].keys())
        ports_value = [cinspect['HostConfig']['PortBindings'][key][0]['HostIp']+':'+cinspect['HostConfig']['PortBindings'][key][0]['HostPort']+':'+key for key in cinspect['HostConfig']['PortBindings']]

        # If bound ports found, don't use the 'expose' value.
        if (ports_value != None) and (ports_value != "") and (ports_value != []) and (ports_value != 'null') and (ports_value != {}) and (ports_value != "default") and (ports_value != 0) and (ports_value != ",") and (ports_value != "no"):
            for index, port in enumerate(ports_value):
                if port[0] == ':':
                    ports_value[index] = port[1:]
            values['ports'] = ports_value
        else:
            values['expose'] = expose_value
    except KeyError:
        # No ports exposed/bound. Continue without them.
        ports = None

    # Iterate through values to finish building yaml dict.
    for key in values:
        value = values[key]
        if (value != None) and (value != "") and (value != []) and (value != 'null') and (value != {}) and (value != "default") and (value != 0) and (value != ",") and (value != "no"):
            ct[key] = value

    # Render yaml file
    pyaml.p(cfile)


if __name__ == "__main__":
    main()

poetry.lock (generated, new file, 321 lines)

@@ -0,0 +1,321 @@
# This file is automatically @generated by Poetry 2.1.1 and should not be changed by hand.
[[package]]
name = "certifi"
version = "2024.8.30"
description = "Python package for providing Mozilla's CA Bundle."
optional = false
python-versions = ">=3.6"
groups = ["main"]
files = [
{file = "certifi-2024.8.30-py3-none-any.whl", hash = "sha256:922820b53db7a7257ffbda3f597266d435245903d80737e34f8a45ff3e3230d8"},
{file = "certifi-2024.8.30.tar.gz", hash = "sha256:bec941d2aa8195e248a60b31ff9f0558284cf01a52591ceda73ea9afffd69fd9"},
]
[[package]]
name = "charset-normalizer"
version = "3.4.0"
description = "The Real First Universal Charset Detector. Open, modern and actively maintained alternative to Chardet."
optional = false
python-versions = ">=3.7.0"
groups = ["main"]
files = [
{file = "charset_normalizer-3.4.0-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:4f9fc98dad6c2eaa32fc3af1417d95b5e3d08aff968df0cd320066def971f9a6"},
{file = "charset_normalizer-3.4.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:0de7b687289d3c1b3e8660d0741874abe7888100efe14bd0f9fd7141bcbda92b"},
{file = "charset_normalizer-3.4.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:5ed2e36c3e9b4f21dd9422f6893dec0abf2cca553af509b10cd630f878d3eb99"},
{file = "charset_normalizer-3.4.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:40d3ff7fc90b98c637bda91c89d51264a3dcf210cade3a2c6f838c7268d7a4ca"},
{file = "charset_normalizer-3.4.0-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1110e22af8ca26b90bd6364fe4c763329b0ebf1ee213ba32b68c73de5752323d"},
{file = "charset_normalizer-3.4.0-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:86f4e8cca779080f66ff4f191a685ced73d2f72d50216f7112185dc02b90b9b7"},
{file = "charset_normalizer-3.4.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7f683ddc7eedd742e2889d2bfb96d69573fde1d92fcb811979cdb7165bb9c7d3"},
{file = "charset_normalizer-3.4.0-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:27623ba66c183eca01bf9ff833875b459cad267aeeb044477fedac35e19ba907"},
{file = "charset_normalizer-3.4.0-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:f606a1881d2663630ea5b8ce2efe2111740df4b687bd78b34a8131baa007f79b"},
{file = "charset_normalizer-3.4.0-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:0b309d1747110feb25d7ed6b01afdec269c647d382c857ef4663bbe6ad95a912"},
{file = "charset_normalizer-3.4.0-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:136815f06a3ae311fae551c3df1f998a1ebd01ddd424aa5603a4336997629e95"},
{file = "charset_normalizer-3.4.0-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:14215b71a762336254351b00ec720a8e85cada43b987da5a042e4ce3e82bd68e"},
{file = "charset_normalizer-3.4.0-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:79983512b108e4a164b9c8d34de3992f76d48cadc9554c9e60b43f308988aabe"},
{file = "charset_normalizer-3.4.0-cp310-cp310-win32.whl", hash = "sha256:c94057af19bc953643a33581844649a7fdab902624d2eb739738a30e2b3e60fc"},
{file = "charset_normalizer-3.4.0-cp310-cp310-win_amd64.whl", hash = "sha256:55f56e2ebd4e3bc50442fbc0888c9d8c94e4e06a933804e2af3e89e2f9c1c749"},
{file = "charset_normalizer-3.4.0-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:0d99dd8ff461990f12d6e42c7347fd9ab2532fb70e9621ba520f9e8637161d7c"},
{file = "charset_normalizer-3.4.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:c57516e58fd17d03ebe67e181a4e4e2ccab1168f8c2976c6a334d4f819fe5944"},
{file = "charset_normalizer-3.4.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:6dba5d19c4dfab08e58d5b36304b3f92f3bd5d42c1a3fa37b5ba5cdf6dfcbcee"},
{file = "charset_normalizer-3.4.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bf4475b82be41b07cc5e5ff94810e6a01f276e37c2d55571e3fe175e467a1a1c"},
{file = "charset_normalizer-3.4.0-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ce031db0408e487fd2775d745ce30a7cd2923667cf3b69d48d219f1d8f5ddeb6"},
{file = "charset_normalizer-3.4.0-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:8ff4e7cdfdb1ab5698e675ca622e72d58a6fa2a8aa58195de0c0061288e6e3ea"},
{file = "charset_normalizer-3.4.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3710a9751938947e6327ea9f3ea6332a09bf0ba0c09cae9cb1f250bd1f1549bc"},
{file = "charset_normalizer-3.4.0-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:82357d85de703176b5587dbe6ade8ff67f9f69a41c0733cf2425378b49954de5"},
{file = "charset_normalizer-3.4.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:47334db71978b23ebcf3c0f9f5ee98b8d65992b65c9c4f2d34c2eaf5bcaf0594"},
{file = "charset_normalizer-3.4.0-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:8ce7fd6767a1cc5a92a639b391891bf1c268b03ec7e021c7d6d902285259685c"},
{file = "charset_normalizer-3.4.0-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:f1a2f519ae173b5b6a2c9d5fa3116ce16e48b3462c8b96dfdded11055e3d6365"},
{file = "charset_normalizer-3.4.0-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:63bc5c4ae26e4bc6be6469943b8253c0fd4e4186c43ad46e713ea61a0ba49129"},
{file = "charset_normalizer-3.4.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:bcb4f8ea87d03bc51ad04add8ceaf9b0f085ac045ab4d74e73bbc2dc033f0236"},
{file = "charset_normalizer-3.4.0-cp311-cp311-win32.whl", hash = "sha256:9ae4ef0b3f6b41bad6366fb0ea4fc1d7ed051528e113a60fa2a65a9abb5b1d99"},
{file = "charset_normalizer-3.4.0-cp311-cp311-win_amd64.whl", hash = "sha256:cee4373f4d3ad28f1ab6290684d8e2ebdb9e7a1b74fdc39e4c211995f77bec27"},
{file = "charset_normalizer-3.4.0-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:0713f3adb9d03d49d365b70b84775d0a0d18e4ab08d12bc46baa6132ba78aaf6"},
{file = "charset_normalizer-3.4.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:de7376c29d95d6719048c194a9cf1a1b0393fbe8488a22008610b0361d834ecf"},
{file = "charset_normalizer-3.4.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:4a51b48f42d9358460b78725283f04bddaf44a9358197b889657deba38f329db"},
{file = "charset_normalizer-3.4.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b295729485b06c1a0683af02a9e42d2caa9db04a373dc38a6a58cdd1e8abddf1"},
{file = "charset_normalizer-3.4.0-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ee803480535c44e7f5ad00788526da7d85525cfefaf8acf8ab9a310000be4b03"},
{file = "charset_normalizer-3.4.0-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3d59d125ffbd6d552765510e3f31ed75ebac2c7470c7274195b9161a32350284"},
{file = "charset_normalizer-3.4.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8cda06946eac330cbe6598f77bb54e690b4ca93f593dee1568ad22b04f347c15"},
{file = "charset_normalizer-3.4.0-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:07afec21bbbbf8a5cc3651aa96b980afe2526e7f048fdfb7f1014d84acc8b6d8"},
{file = "charset_normalizer-3.4.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:6b40e8d38afe634559e398cc32b1472f376a4099c75fe6299ae607e404c033b2"},
{file = "charset_normalizer-3.4.0-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:b8dcd239c743aa2f9c22ce674a145e0a25cb1566c495928440a181ca1ccf6719"},
{file = "charset_normalizer-3.4.0-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:84450ba661fb96e9fd67629b93d2941c871ca86fc38d835d19d4225ff946a631"},
{file = "charset_normalizer-3.4.0-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:44aeb140295a2f0659e113b31cfe92c9061622cadbc9e2a2f7b8ef6b1e29ef4b"},
{file = "charset_normalizer-3.4.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:1db4e7fefefd0f548d73e2e2e041f9df5c59e178b4c72fbac4cc6f535cfb1565"},
{file = "charset_normalizer-3.4.0-cp312-cp312-win32.whl", hash = "sha256:5726cf76c982532c1863fb64d8c6dd0e4c90b6ece9feb06c9f202417a31f7dd7"},
{file = "charset_normalizer-3.4.0-cp312-cp312-win_amd64.whl", hash = "sha256:b197e7094f232959f8f20541ead1d9862ac5ebea1d58e9849c1bf979255dfac9"},
{file = "charset_normalizer-3.4.0-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:dd4eda173a9fcccb5f2e2bd2a9f423d180194b1bf17cf59e3269899235b2a114"},
{file = "charset_normalizer-3.4.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:e9e3c4c9e1ed40ea53acf11e2a386383c3304212c965773704e4603d589343ed"},
{file = "charset_normalizer-3.4.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:92a7e36b000bf022ef3dbb9c46bfe2d52c047d5e3f3343f43204263c5addc250"},
{file = "charset_normalizer-3.4.0-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:54b6a92d009cbe2fb11054ba694bc9e284dad30a26757b1e372a1fdddaf21920"},
{file = "charset_normalizer-3.4.0-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1ffd9493de4c922f2a38c2bf62b831dcec90ac673ed1ca182fe11b4d8e9f2a64"},
{file = "charset_normalizer-3.4.0-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:35c404d74c2926d0287fbd63ed5d27eb911eb9e4a3bb2c6d294f3cfd4a9e0c23"},
{file = "charset_normalizer-3.4.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4796efc4faf6b53a18e3d46343535caed491776a22af773f366534056c4e1fbc"},
{file = "charset_normalizer-3.4.0-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e7fdd52961feb4c96507aa649550ec2a0d527c086d284749b2f582f2d40a2e0d"},
{file = "charset_normalizer-3.4.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:92db3c28b5b2a273346bebb24857fda45601aef6ae1c011c0a997106581e8a88"},
{file = "charset_normalizer-3.4.0-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:ab973df98fc99ab39080bfb0eb3a925181454d7c3ac8a1e695fddfae696d9e90"},
{file = "charset_normalizer-3.4.0-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:4b67fdab07fdd3c10bb21edab3cbfe8cf5696f453afce75d815d9d7223fbe88b"},
{file = "charset_normalizer-3.4.0-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:aa41e526a5d4a9dfcfbab0716c7e8a1b215abd3f3df5a45cf18a12721d31cb5d"},
{file = "charset_normalizer-3.4.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:ffc519621dce0c767e96b9c53f09c5d215578e10b02c285809f76509a3931482"},
{file = "charset_normalizer-3.4.0-cp313-cp313-win32.whl", hash = "sha256:f19c1585933c82098c2a520f8ec1227f20e339e33aca8fa6f956f6691b784e67"},
{file = "charset_normalizer-3.4.0-cp313-cp313-win_amd64.whl", hash = "sha256:707b82d19e65c9bd28b81dde95249b07bf9f5b90ebe1ef17d9b57473f8a64b7b"},
{file = "charset_normalizer-3.4.0-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:dbe03226baf438ac4fda9e2d0715022fd579cb641c4cf639fa40d53b2fe6f3e2"},
{file = "charset_normalizer-3.4.0-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:dd9a8bd8900e65504a305bf8ae6fa9fbc66de94178c420791d0293702fce2df7"},
{file = "charset_normalizer-3.4.0-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:b8831399554b92b72af5932cdbbd4ddc55c55f631bb13ff8fe4e6536a06c5c51"},
{file = "charset_normalizer-3.4.0-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a14969b8691f7998e74663b77b4c36c0337cb1df552da83d5c9004a93afdb574"},
{file = "charset_normalizer-3.4.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:dcaf7c1524c0542ee2fc82cc8ec337f7a9f7edee2532421ab200d2b920fc97cf"},
{file = "charset_normalizer-3.4.0-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:425c5f215d0eecee9a56cdb703203dda90423247421bf0d67125add85d0c4455"},
{file = "charset_normalizer-3.4.0-cp37-cp37m-musllinux_1_2_aarch64.whl", hash = "sha256:d5b054862739d276e09928de37c79ddeec42a6e1bfc55863be96a36ba22926f6"},
{file = "charset_normalizer-3.4.0-cp37-cp37m-musllinux_1_2_i686.whl", hash = "sha256:f3e73a4255342d4eb26ef6df01e3962e73aa29baa3124a8e824c5d3364a65748"},
{file = "charset_normalizer-3.4.0-cp37-cp37m-musllinux_1_2_ppc64le.whl", hash = "sha256:2f6c34da58ea9c1a9515621f4d9ac379871a8f21168ba1b5e09d74250de5ad62"},
{file = "charset_normalizer-3.4.0-cp37-cp37m-musllinux_1_2_s390x.whl", hash = "sha256:f09cb5a7bbe1ecae6e87901a2eb23e0256bb524a79ccc53eb0b7629fbe7677c4"},
{file = "charset_normalizer-3.4.0-cp37-cp37m-musllinux_1_2_x86_64.whl", hash = "sha256:0099d79bdfcf5c1f0c2c72f91516702ebf8b0b8ddd8905f97a8aecf49712c621"},
{file = "charset_normalizer-3.4.0-cp37-cp37m-win32.whl", hash = "sha256:9c98230f5042f4945f957d006edccc2af1e03ed5e37ce7c373f00a5a4daa6149"},
{file = "charset_normalizer-3.4.0-cp37-cp37m-win_amd64.whl", hash = "sha256:62f60aebecfc7f4b82e3f639a7d1433a20ec32824db2199a11ad4f5e146ef5ee"},
{file = "charset_normalizer-3.4.0-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:af73657b7a68211996527dbfeffbb0864e043d270580c5aef06dc4b659a4b578"},
{file = "charset_normalizer-3.4.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:cab5d0b79d987c67f3b9e9c53f54a61360422a5a0bc075f43cab5621d530c3b6"},
{file = "charset_normalizer-3.4.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:9289fd5dddcf57bab41d044f1756550f9e7cf0c8e373b8cdf0ce8773dc4bd417"},
{file = "charset_normalizer-3.4.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6b493a043635eb376e50eedf7818f2f322eabbaa974e948bd8bdd29eb7ef2a51"},
{file = "charset_normalizer-3.4.0-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:9fa2566ca27d67c86569e8c85297aaf413ffab85a8960500f12ea34ff98e4c41"},
{file = "charset_normalizer-3.4.0-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a8e538f46104c815be19c975572d74afb53f29650ea2025bbfaef359d2de2f7f"},
{file = "charset_normalizer-3.4.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6fd30dc99682dc2c603c2b315bded2799019cea829f8bf57dc6b61efde6611c8"},
{file = "charset_normalizer-3.4.0-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:2006769bd1640bdf4d5641c69a3d63b71b81445473cac5ded39740a226fa88ab"},
{file = "charset_normalizer-3.4.0-cp38-cp38-musllinux_1_2_aarch64.whl", hash = "sha256:dc15e99b2d8a656f8e666854404f1ba54765871104e50c8e9813af8a7db07f12"},
{file = "charset_normalizer-3.4.0-cp38-cp38-musllinux_1_2_i686.whl", hash = "sha256:ab2e5bef076f5a235c3774b4f4028a680432cded7cad37bba0fd90d64b187d19"},
{file = "charset_normalizer-3.4.0-cp38-cp38-musllinux_1_2_ppc64le.whl", hash = "sha256:4ec9dd88a5b71abfc74e9df5ebe7921c35cbb3b641181a531ca65cdb5e8e4dea"},
{file = "charset_normalizer-3.4.0-cp38-cp38-musllinux_1_2_s390x.whl", hash = "sha256:43193c5cda5d612f247172016c4bb71251c784d7a4d9314677186a838ad34858"},
{file = "charset_normalizer-3.4.0-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:aa693779a8b50cd97570e5a0f343538a8dbd3e496fa5dcb87e29406ad0299654"},
{file = "charset_normalizer-3.4.0-cp38-cp38-win32.whl", hash = "sha256:7706f5850360ac01d80c89bcef1640683cc12ed87f42579dab6c5d3ed6888613"},
{file = "charset_normalizer-3.4.0-cp38-cp38-win_amd64.whl", hash = "sha256:c3e446d253bd88f6377260d07c895816ebf33ffffd56c1c792b13bff9c3e1ade"},
{file = "charset_normalizer-3.4.0-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:980b4f289d1d90ca5efcf07958d3eb38ed9c0b7676bf2831a54d4f66f9c27dfa"},
{file = "charset_normalizer-3.4.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:f28f891ccd15c514a0981f3b9db9aa23d62fe1a99997512b0491d2ed323d229a"},
{file = "charset_normalizer-3.4.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:a8aacce6e2e1edcb6ac625fb0f8c3a9570ccc7bfba1f63419b3769ccf6a00ed0"},
{file = "charset_normalizer-3.4.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bd7af3717683bea4c87acd8c0d3d5b44d56120b26fd3f8a692bdd2d5260c620a"},
{file = "charset_normalizer-3.4.0-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:5ff2ed8194587faf56555927b3aa10e6fb69d931e33953943bc4f837dfee2242"},
{file = "charset_normalizer-3.4.0-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:e91f541a85298cf35433bf66f3fab2a4a2cff05c127eeca4af174f6d497f0d4b"},
{file = "charset_normalizer-3.4.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:309a7de0a0ff3040acaebb35ec45d18db4b28232f21998851cfa709eeff49d62"},
{file = "charset_normalizer-3.4.0-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:285e96d9d53422efc0d7a17c60e59f37fbf3dfa942073f666db4ac71e8d726d0"},
{file = "charset_normalizer-3.4.0-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:5d447056e2ca60382d460a604b6302d8db69476fd2015c81e7c35417cfabe4cd"},
{file = "charset_normalizer-3.4.0-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:20587d20f557fe189b7947d8e7ec5afa110ccf72a3128d61a2a387c3313f46be"},
{file = "charset_normalizer-3.4.0-cp39-cp39-musllinux_1_2_ppc64le.whl", hash = "sha256:130272c698667a982a5d0e626851ceff662565379baf0ff2cc58067b81d4f11d"},
{file = "charset_normalizer-3.4.0-cp39-cp39-musllinux_1_2_s390x.whl", hash = "sha256:ab22fbd9765e6954bc0bcff24c25ff71dcbfdb185fcdaca49e81bac68fe724d3"},
{file = "charset_normalizer-3.4.0-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:7782afc9b6b42200f7362858f9e73b1f8316afb276d316336c0ec3bd73312742"},
{file = "charset_normalizer-3.4.0-cp39-cp39-win32.whl", hash = "sha256:2de62e8801ddfff069cd5c504ce3bc9672b23266597d4e4f50eda28846c322f2"},
{file = "charset_normalizer-3.4.0-cp39-cp39-win_amd64.whl", hash = "sha256:95c3c157765b031331dd4db3c775e58deaee050a3042fcad72cbc4189d7c8dca"},
{file = "charset_normalizer-3.4.0-py3-none-any.whl", hash = "sha256:fe9f97feb71aa9896b81973a7bbada8c49501dc73e58a10fcef6663af95e5079"},
{file = "charset_normalizer-3.4.0.tar.gz", hash = "sha256:223217c3d4f82c3ac5e29032b3f1c2eb0fb591b72161f86d93f5719079dae93e"},
]
[[package]]
name = "docker"
version = "7.1.0"
description = "A Python library for the Docker Engine API."
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "docker-7.1.0-py3-none-any.whl", hash = "sha256:c96b93b7f0a746f9e77d325bcfb87422a3d8bd4f03136ae8a85b37f1898d5fc0"},
{file = "docker-7.1.0.tar.gz", hash = "sha256:ad8c70e6e3f8926cb8a92619b832b4ea5299e2831c14284663184e200546fa6c"},
]
[package.dependencies]
pywin32 = {version = ">=304", markers = "sys_platform == \"win32\""}
requests = ">=2.26.0"
urllib3 = ">=1.26.0"
[package.extras]
dev = ["coverage (==7.2.7)", "pytest (==7.4.2)", "pytest-cov (==4.1.0)", "pytest-timeout (==2.1.0)", "ruff (==0.1.8)"]
docs = ["myst-parser (==0.18.0)", "sphinx (==5.1.1)"]
ssh = ["paramiko (>=2.4.3)"]
websockets = ["websocket-client (>=1.3.0)"]
[[package]]
name = "idna"
version = "3.10"
description = "Internationalized Domain Names in Applications (IDNA)"
optional = false
python-versions = ">=3.6"
groups = ["main"]
files = [
{file = "idna-3.10-py3-none-any.whl", hash = "sha256:946d195a0d259cbba61165e88e65941f16e9b36ea6ddb97f00452bae8b1287d3"},
{file = "idna-3.10.tar.gz", hash = "sha256:12f65c9b470abda6dc35cf8e63cc574b1c52b11df2c86030af0ac09b01b13ea9"},
]
[package.extras]
all = ["flake8 (>=7.1.1)", "mypy (>=1.11.2)", "pytest (>=8.3.2)", "ruff (>=0.6.2)"]
[[package]]
name = "pyaml"
version = "24.9.0"
description = "PyYAML-based module to produce a bit more pretty and readable YAML-serialized data"
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "pyaml-24.9.0-py3-none-any.whl", hash = "sha256:31080551502f1014852b3c966a96c796adc79b4cf86e165f28ed83455bf19c62"},
{file = "pyaml-24.9.0.tar.gz", hash = "sha256:e78dee8b0d4fed56bb9fa11a8a7858e6fade1ec70a9a122cee6736efac3e69b5"},
]
[package.dependencies]
PyYAML = "*"
[package.extras]
anchors = ["unidecode"]
[[package]]
name = "pywin32"
version = "308"
description = "Python for Window Extensions"
optional = false
python-versions = "*"
groups = ["main"]
markers = "sys_platform == \"win32\""
files = [
{file = "pywin32-308-cp310-cp310-win32.whl", hash = "sha256:796ff4426437896550d2981b9c2ac0ffd75238ad9ea2d3bfa67a1abd546d262e"},
{file = "pywin32-308-cp310-cp310-win_amd64.whl", hash = "sha256:4fc888c59b3c0bef905ce7eb7e2106a07712015ea1c8234b703a088d46110e8e"},
{file = "pywin32-308-cp310-cp310-win_arm64.whl", hash = "sha256:a5ab5381813b40f264fa3495b98af850098f814a25a63589a8e9eb12560f450c"},
{file = "pywin32-308-cp311-cp311-win32.whl", hash = "sha256:5d8c8015b24a7d6855b1550d8e660d8daa09983c80e5daf89a273e5c6fb5095a"},
{file = "pywin32-308-cp311-cp311-win_amd64.whl", hash = "sha256:575621b90f0dc2695fec346b2d6302faebd4f0f45c05ea29404cefe35d89442b"},
{file = "pywin32-308-cp311-cp311-win_arm64.whl", hash = "sha256:100a5442b7332070983c4cd03f2e906a5648a5104b8a7f50175f7906efd16bb6"},
{file = "pywin32-308-cp312-cp312-win32.whl", hash = "sha256:587f3e19696f4bf96fde9d8a57cec74a57021ad5f204c9e627e15c33ff568897"},
{file = "pywin32-308-cp312-cp312-win_amd64.whl", hash = "sha256:00b3e11ef09ede56c6a43c71f2d31857cf7c54b0ab6e78ac659497abd2834f47"},
{file = "pywin32-308-cp312-cp312-win_arm64.whl", hash = "sha256:9b4de86c8d909aed15b7011182c8cab38c8850de36e6afb1f0db22b8959e3091"},
{file = "pywin32-308-cp313-cp313-win32.whl", hash = "sha256:1c44539a37a5b7b21d02ab34e6a4d314e0788f1690d65b48e9b0b89f31abbbed"},
{file = "pywin32-308-cp313-cp313-win_amd64.whl", hash = "sha256:fd380990e792eaf6827fcb7e187b2b4b1cede0585e3d0c9e84201ec27b9905e4"},
{file = "pywin32-308-cp313-cp313-win_arm64.whl", hash = "sha256:ef313c46d4c18dfb82a2431e3051ac8f112ccee1a34f29c263c583c568db63cd"},
{file = "pywin32-308-cp37-cp37m-win32.whl", hash = "sha256:1f696ab352a2ddd63bd07430080dd598e6369152ea13a25ebcdd2f503a38f1ff"},
{file = "pywin32-308-cp37-cp37m-win_amd64.whl", hash = "sha256:13dcb914ed4347019fbec6697a01a0aec61019c1046c2b905410d197856326a6"},
{file = "pywin32-308-cp38-cp38-win32.whl", hash = "sha256:5794e764ebcabf4ff08c555b31bd348c9025929371763b2183172ff4708152f0"},
{file = "pywin32-308-cp38-cp38-win_amd64.whl", hash = "sha256:3b92622e29d651c6b783e368ba7d6722b1634b8e70bd376fd7610fe1992e19de"},
{file = "pywin32-308-cp39-cp39-win32.whl", hash = "sha256:7873ca4dc60ab3287919881a7d4f88baee4a6e639aa6962de25a98ba6b193341"},
{file = "pywin32-308-cp39-cp39-win_amd64.whl", hash = "sha256:71b3322d949b4cc20776436a9c9ba0eeedcbc9c650daa536df63f0ff111bb920"},
]
[[package]]
name = "pyyaml"
version = "6.0.2"
description = "YAML parser and emitter for Python"
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "PyYAML-6.0.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:0a9a2848a5b7feac301353437eb7d5957887edbf81d56e903999a75a3d743086"},
{file = "PyYAML-6.0.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:29717114e51c84ddfba879543fb232a6ed60086602313ca38cce623c1d62cfbf"},
{file = "PyYAML-6.0.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8824b5a04a04a047e72eea5cec3bc266db09e35de6bdfe34c9436ac5ee27d237"},
{file = "PyYAML-6.0.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:7c36280e6fb8385e520936c3cb3b8042851904eba0e58d277dca80a5cfed590b"},
{file = "PyYAML-6.0.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ec031d5d2feb36d1d1a24380e4db6d43695f3748343d99434e6f5f9156aaa2ed"},
{file = "PyYAML-6.0.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:936d68689298c36b53b29f23c6dbb74de12b4ac12ca6cfe0e047bedceea56180"},
{file = "PyYAML-6.0.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:23502f431948090f597378482b4812b0caae32c22213aecf3b55325e049a6c68"},
{file = "PyYAML-6.0.2-cp310-cp310-win32.whl", hash = "sha256:2e99c6826ffa974fe6e27cdb5ed0021786b03fc98e5ee3c5bfe1fd5015f42b99"},
{file = "PyYAML-6.0.2-cp310-cp310-win_amd64.whl", hash = "sha256:a4d3091415f010369ae4ed1fc6b79def9416358877534caf6a0fdd2146c87a3e"},
{file = "PyYAML-6.0.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:cc1c1159b3d456576af7a3e4d1ba7e6924cb39de8f67111c735f6fc832082774"},
{file = "PyYAML-6.0.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:1e2120ef853f59c7419231f3bf4e7021f1b936f6ebd222406c3b60212205d2ee"},
{file = "PyYAML-6.0.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5d225db5a45f21e78dd9358e58a98702a0302f2659a3c6cd320564b75b86f47c"},
{file = "PyYAML-6.0.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5ac9328ec4831237bec75defaf839f7d4564be1e6b25ac710bd1a96321cc8317"},
{file = "PyYAML-6.0.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3ad2a3decf9aaba3d29c8f537ac4b243e36bef957511b4766cb0057d32b0be85"},
{file = "PyYAML-6.0.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:ff3824dc5261f50c9b0dfb3be22b4567a6f938ccce4587b38952d85fd9e9afe4"},
{file = "PyYAML-6.0.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:797b4f722ffa07cc8d62053e4cff1486fa6dc094105d13fea7b1de7d8bf71c9e"},
{file = "PyYAML-6.0.2-cp311-cp311-win32.whl", hash = "sha256:11d8f3dd2b9c1207dcaf2ee0bbbfd5991f571186ec9cc78427ba5bd32afae4b5"},
{file = "PyYAML-6.0.2-cp311-cp311-win_amd64.whl", hash = "sha256:e10ce637b18caea04431ce14fabcf5c64a1c61ec9c56b071a4b7ca131ca52d44"},
{file = "PyYAML-6.0.2-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:c70c95198c015b85feafc136515252a261a84561b7b1d51e3384e0655ddf25ab"},
{file = "PyYAML-6.0.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:ce826d6ef20b1bc864f0a68340c8b3287705cae2f8b4b1d932177dcc76721725"},
{file = "PyYAML-6.0.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1f71ea527786de97d1a0cc0eacd1defc0985dcf6b3f17bb77dcfc8c34bec4dc5"},
{file = "PyYAML-6.0.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:9b22676e8097e9e22e36d6b7bda33190d0d400f345f23d4065d48f4ca7ae0425"},
{file = "PyYAML-6.0.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:80bab7bfc629882493af4aa31a4cfa43a4c57c83813253626916b8c7ada83476"},
{file = "PyYAML-6.0.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:0833f8694549e586547b576dcfaba4a6b55b9e96098b36cdc7ebefe667dfed48"},
{file = "PyYAML-6.0.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8b9c7197f7cb2738065c481a0461e50ad02f18c78cd75775628afb4d7137fb3b"},
{file = "PyYAML-6.0.2-cp312-cp312-win32.whl", hash = "sha256:ef6107725bd54b262d6dedcc2af448a266975032bc85ef0172c5f059da6325b4"},
{file = "PyYAML-6.0.2-cp312-cp312-win_amd64.whl", hash = "sha256:7e7401d0de89a9a855c839bc697c079a4af81cf878373abd7dc625847d25cbd8"},
{file = "PyYAML-6.0.2-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:efdca5630322a10774e8e98e1af481aad470dd62c3170801852d752aa7a783ba"},
{file = "PyYAML-6.0.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:50187695423ffe49e2deacb8cd10510bc361faac997de9efef88badc3bb9e2d1"},
{file = "PyYAML-6.0.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0ffe8360bab4910ef1b9e87fb812d8bc0a308b0d0eef8c8f44e0254ab3b07133"},
{file = "PyYAML-6.0.2-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:17e311b6c678207928d649faa7cb0d7b4c26a0ba73d41e99c4fff6b6c3276484"},
{file = "PyYAML-6.0.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:70b189594dbe54f75ab3a1acec5f1e3faa7e8cf2f1e08d9b561cb41b845f69d5"},
{file = "PyYAML-6.0.2-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:41e4e3953a79407c794916fa277a82531dd93aad34e29c2a514c2c0c5fe971cc"},
{file = "PyYAML-6.0.2-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:68ccc6023a3400877818152ad9a1033e3db8625d899c72eacb5a668902e4d652"},
{file = "PyYAML-6.0.2-cp313-cp313-win32.whl", hash = "sha256:bc2fa7c6b47d6bc618dd7fb02ef6fdedb1090ec036abab80d4681424b84c1183"},
{file = "PyYAML-6.0.2-cp313-cp313-win_amd64.whl", hash = "sha256:8388ee1976c416731879ac16da0aff3f63b286ffdd57cdeb95f3f2e085687563"},
{file = "PyYAML-6.0.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:24471b829b3bf607e04e88d79542a9d48bb037c2267d7927a874e6c205ca7e9a"},
{file = "PyYAML-6.0.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d7fded462629cfa4b685c5416b949ebad6cec74af5e2d42905d41e257e0869f5"},
{file = "PyYAML-6.0.2-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:d84a1718ee396f54f3a086ea0a66d8e552b2ab2017ef8b420e92edbc841c352d"},
{file = "PyYAML-6.0.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9056c1ecd25795207ad294bcf39f2db3d845767be0ea6e6a34d856f006006083"},
{file = "PyYAML-6.0.2-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:82d09873e40955485746739bcb8b4586983670466c23382c19cffecbf1fd8706"},
{file = "PyYAML-6.0.2-cp38-cp38-win32.whl", hash = "sha256:43fa96a3ca0d6b1812e01ced1044a003533c47f6ee8aca31724f78e93ccc089a"},
{file = "PyYAML-6.0.2-cp38-cp38-win_amd64.whl", hash = "sha256:01179a4a8559ab5de078078f37e5c1a30d76bb88519906844fd7bdea1b7729ff"},
{file = "PyYAML-6.0.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:688ba32a1cffef67fd2e9398a2efebaea461578b0923624778664cc1c914db5d"},
{file = "PyYAML-6.0.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:a8786accb172bd8afb8be14490a16625cbc387036876ab6ba70912730faf8e1f"},
{file = "PyYAML-6.0.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d8e03406cac8513435335dbab54c0d385e4a49e4945d2909a581c83647ca0290"},
{file = "PyYAML-6.0.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f753120cb8181e736c57ef7636e83f31b9c0d1722c516f7e86cf15b7aa57ff12"},
{file = "PyYAML-6.0.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3b1fdb9dc17f5a7677423d508ab4f243a726dea51fa5e70992e59a7411c89d19"},
{file = "PyYAML-6.0.2-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:0b69e4ce7a131fe56b7e4d770c67429700908fc0752af059838b1cfb41960e4e"},
{file = "PyYAML-6.0.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:a9f8c2e67970f13b16084e04f134610fd1d374bf477b17ec1599185cf611d725"},
{file = "PyYAML-6.0.2-cp39-cp39-win32.whl", hash = "sha256:6395c297d42274772abc367baaa79683958044e5d3835486c16da75d2a694631"},
{file = "PyYAML-6.0.2-cp39-cp39-win_amd64.whl", hash = "sha256:39693e1f8320ae4f43943590b49779ffb98acb81f788220ea932a6b6c51004d8"},
{file = "pyyaml-6.0.2.tar.gz", hash = "sha256:d584d9ec91ad65861cc08d42e834324ef890a082e591037abe114850ff7bbc3e"},
]
[[package]]
name = "requests"
version = "2.32.4"
description = "Python HTTP for Humans."
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "requests-2.32.4-py3-none-any.whl", hash = "sha256:27babd3cda2a6d50b30443204ee89830707d396671944c998b5975b031ac2b2c"},
{file = "requests-2.32.4.tar.gz", hash = "sha256:27d0316682c8a29834d3264820024b62a36942083d52caf2f14c0591336d3422"},
]
[package.dependencies]
certifi = ">=2017.4.17"
charset_normalizer = ">=2,<4"
idna = ">=2.5,<4"
urllib3 = ">=1.21.1,<3"
[package.extras]
socks = ["PySocks (>=1.5.6,!=1.5.7)"]
use-chardet-on-py3 = ["chardet (>=3.0.2,<6)"]
[[package]]
name = "urllib3"
version = "2.2.3"
description = "HTTP library with thread-safe connection pooling, file post, and more."
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "urllib3-2.2.3-py3-none-any.whl", hash = "sha256:ca899ca043dcb1bafa3e262d73aa25c465bfb49e0bd9dd5d59f1d0acba2f8fac"},
{file = "urllib3-2.2.3.tar.gz", hash = "sha256:e7d814a81dad81e6caf2ec9fdedb284ecc9c73076b62654547cc64ccdcae26e9"},
]
[package.extras]
brotli = ["brotli (>=1.0.9) ; platform_python_implementation == \"CPython\"", "brotlicffi (>=0.8.0) ; platform_python_implementation != \"CPython\""]
h2 = ["h2 (>=4,<5)"]
socks = ["pysocks (>=1.5.6,!=1.5.7,<2.0)"]
zstd = ["zstandard (>=0.18.0)"]
[metadata]
lock-version = "2.1"
python-versions = ">=3.8"
content-hash = "9fcebd0faade00ce36ae6f306ab2b23521a080299c97e62cc409a5472de8d8f7"

pyproject.toml (new file, 41 lines)

@@ -0,0 +1,41 @@
[tool.poetry]
name = "docker-autocompose"
version = "1.3.0"
description = "Generate a docker-compose yaml definition from a running container"
authors = ["Red5d"]
keywords = ["docker", "yaml", "container"]
license = "GPLv2"
classifiers = [
"Development Status :: 5 - Production/Stable",
"Environment :: Console",
"Intended Audience :: System Administrators",
"License :: OSI Approved :: GNU General Public License v2 (GPLv2)",
"Operating System :: POSIX :: Linux",
"Programming Language :: Python :: 3 :: Only",
"Programming Language :: Python :: 3.8",
"Programming Language :: Python :: 3.9",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: 3.12",
"Topic :: Utilities"
]
homepage = "https://github.com/Red5d/docker-autocompose"
documentation = "https://github.com/Red5d/docker-autocompose/blob/master/README.md"
repository = "https://github.com/Red5d/docker-autocompose.git"
readme = "README.md"
packages = [
{ include = "src" }
]
[tool.poetry.dependencies]
# see https://python-poetry.org/docs/dependency-specification/ for version specifiers
python = ">=3.8"
pyaml = "~24.9.0"
docker = "~7.1.0"
[tool.poetry.scripts]
autocompose = "src.autocompose:main"
[build-system]
requires = ["poetry-core"]
build-backend = "poetry.core.masonry.api"

setup.py

@@ -1,18 +0,0 @@
from setuptools import setup, find_packages
setup(
name = "docker-autocompose",
version = "1.0.1",
description = "Generate a docker-compose yaml definition from a running container",
url = "https://github.com/Red5d/docker-autocompose",
author = "Red5d",
license = "GPLv2",
keywords = "docker yaml container",
packages = find_packages(),
install_requires = ['pyaml>=15.8.2', 'docker-py>=1.6.0'],
scripts = ['autocompose.py'],
entry_points={
'console_scripts': [
'autocompose = autocompose:main',
]
}
)

src/__init__.py Normal file

src/autocompose.py Normal file

@@ -0,0 +1,315 @@
#! /usr/bin/env python3
import argparse
import datetime
import re
import sys
from collections import OrderedDict
import docker
import pyaml
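
# Render Python booleans as lowercase "false"/"true" so the generated YAML uses Compose-compatible boolean values.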
pyaml.add_representer(bool,lambda s,o: s.represent_scalar('tag:yaml.org,2002:bool',['false','true'][o]))
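
# Values treated as empty or default; keys whose value matches one of these are omitted from the generated service definition.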
IGNORE_VALUES = [None, "", [], "null", {}, "default", 0, ",", "no"]
def shell_escape_string(input_string):
# Currently known issues:
# - Basic Auth strings (e.g. set via Træfik labels) contain $ characters, which must be doubled. See https://stackoverflow.com/a/40621373/5885325
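    # e.g. an env entry like HTPASSWD=$apr1$abc is rewritten to HTPASSWD=$$apr1$$abc so Compose
    # does not treat the "$" as variable interpolation.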
replaced_string = input_string
for substitution in (
("$", "$$"),
):
replaced_string = replaced_string.replace(substitution[0], substitution[1])
return replaced_string
def list_container_names():
c = docker.from_env()
return [container.name for container in c.containers.list(all=True)]
def list_network_names():
c = docker.from_env()
return [network.name for network in c.networks.list()]
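
# Collect attributes for every network on the host, shaped like the top-level "networks" section of a compose file.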
def generate_network_info():
networks = {}
for network_name in list_network_names():
connection = docker.from_env()
network_attributes = connection.networks.get(network_name).attrs
values = {
"name": network_attributes.get("Name"),
"scope": network_attributes.get("Scope", "local"),
"driver": network_attributes.get("Driver", None),
"enable_ipv6": network_attributes.get("EnableIPv6", False),
"internal": network_attributes.get("Internal", False),
"ipam": {
"driver": network_attributes.get("IPAM", {}).get("Driver", "default"),
"config": [
{key.lower(): value for key, value in config.items()}
for config in network_attributes.get("IPAM", {}).get("Config", [])
],
},
}
networks[network_name] = {key: value for key, value in values.items()}
return networks
def main():
parser = argparse.ArgumentParser(
        description="Generate a docker-compose yaml definition from a running container.",
)
parser.add_argument(
"-a",
"--all",
action="store_true",
help="Include all active containers",
)
parser.add_argument(
"-v",
"--version",
type=int,
default=3,
help="Compose file version (1 or 3)",
)
parser.add_argument(
"cnames",
nargs="*",
type=str,
help="The name of the container to process.",
)
parser.add_argument(
"-c",
"--createvolumes",
action="store_true",
help="Create new volumes instead of reusing existing ones",
)
parser.add_argument(
"-f",
"--filter",
type=str,
help="Filter containers by regex",
)
args = parser.parse_args()
container_names = args.cnames
if args.all:
container_names.extend(list_container_names())
if args.filter:
cfilter = re.compile(args.filter)
container_names = [c for c in container_names if cfilter.search(c)]
struct = {}
networks = {}
volumes = {}
containers = {}
for cname in container_names:
cfile, c_networks, c_volumes = generate(cname, createvolumes=args.createvolumes)
struct.update(cfile)
        if c_networks is not None:
            networks.update(c_networks)
        if c_volumes is not None:
            volumes.update(c_volumes)
    # Collapse empty networks/volumes to None only after the loop; doing it per container would let a
    # container without networks or volumes reset the dicts accumulated so far.
if len(networks) == 0:
networks = None
if len(volumes) == 0:
volumes = None
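    # With --all, replace the per-container network stubs with full definitions for every network on the host.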
if args.all:
host_networks = generate_network_info()
networks = host_networks
render(struct, args, networks, volumes)
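
# Print the assembled definition as YAML on stdout; version 1 output has no top-level version/services/networks/volumes keys.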
def render(struct, args, networks, volumes):
# Render yaml file
if args.version == 1:
pyaml.p(OrderedDict(struct))
else:
ans = {"version": '3.6', "services": struct}
if networks is not None:
ans["networks"] = networks
if volumes is not None:
ans["volumes"] = volumes
pyaml.p(OrderedDict(ans), string_val_style='"')
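
# Build the service definition for a single container, together with the networks and volumes dicts it references.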
def generate(cname, createvolumes=False):
c = docker.from_env()
try:
cid = [x.short_id for x in c.containers.list(all=True) if cname == x.name or x.short_id in cname][0]
except IndexError:
print("That container is not available.", file=sys.stderr)
sys.exit(1)
cattrs = c.containers.get(cid).attrs
# Build yaml dict structure
cfile = {}
cfile[cattrs.get("Name")[1:]] = {}
ct = cfile[cattrs.get("Name")[1:]]
default_networks = ["bridge", "host", "none"]
values = {
"cap_drop": cattrs.get("HostConfig", {}).get("CapDrop", None),
"cgroup_parent": cattrs.get("HostConfig", {}).get("CgroupParent", None),
"container_name": cattrs.get("Name")[1:],
"devices": [],
"dns": cattrs.get("HostConfig", {}).get("Dns", None),
"dns_search": cattrs.get("HostConfig", {}).get("DnsSearch", None),
"environment": cattrs.get("Config", {}).get("Env", None),
"extra_hosts": cattrs.get("HostConfig", {}).get("ExtraHosts", None),
"image": cattrs.get("Config", {}).get("Image", None),
"labels": cattrs.get("Config", {}).get("Labels", {}),
"links": cattrs.get("HostConfig", {}).get("Links"),
#'log_driver': cattrs.get('HostConfig']['LogConfig']['Type'],
#'log_opt': cattrs.get('HostConfig']['LogConfig']['Config'],
"logging": {
"driver": cattrs.get("HostConfig", {}).get("LogConfig", {}).get("Type", None),
"options": cattrs.get("HostConfig", {}).get("LogConfig", {}).get("Config", None),
},
"networks": {
x for x in cattrs.get("NetworkSettings", {}).get("Networks", {}).keys() if x not in default_networks
},
"security_opt": cattrs.get("HostConfig", {}).get("SecurityOpt"),
"ulimits": cattrs.get("HostConfig", {}).get("Ulimits"),
# the line below would not handle type bind
        # 'volumes': [f'{m["Name"]}:{m["Destination"]}' for m in cattrs.get('Mounts'] if m['Type'] == 'volume'],
        "mounts": cattrs.get("Mounts"),  # temporary: consumed below to build "volumes", then removed before returning
"volume_driver": cattrs.get("HostConfig", {}).get("VolumeDriver", None),
"volumes_from": cattrs.get("HostConfig", {}).get("VolumesFrom", None),
"entrypoint": cattrs.get("Config", {}).get("Entrypoint", None),
"user": cattrs.get("Config", {}).get("User", None),
"working_dir": cattrs.get("Config", {}).get("WorkingDir", None),
"domainname": cattrs.get("Config", {}).get("Domainname", None),
"hostname": cattrs.get("Config", {}).get("Hostname", None),
"ipc": cattrs.get("HostConfig", {}).get("IpcMode", None),
"mac_address": cattrs.get("NetworkSettings", {}).get("MacAddress", None),
"privileged": cattrs.get("HostConfig", {}).get("Privileged", None),
"restart": cattrs.get("HostConfig", {}).get("RestartPolicy", {}).get("Name", None),
"read_only": cattrs.get("HostConfig", {}).get("ReadonlyRootfs", None),
"stdin_open": cattrs.get("Config", {}).get("OpenStdin", None),
"tty": cattrs.get("Config", {}).get("Tty", None),
}
# Populate devices key if device values are present
if cattrs.get("HostConfig", {}).get("Devices"):
values["devices"] = [
x["PathOnHost"] + ":" + x["PathInContainer"] for x in cattrs.get("HostConfig", {}).get("Devices")
]
networks = {}
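    # If the container is only attached to default networks (bridge/host/none), fall back to network_mode
    # with the first attached network instead of emitting a networks section.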
if values["networks"] == set():
del values["networks"]
if len(cattrs.get("NetworkSettings", {}).get("Networks", {}).keys()) > 0:
assumed_default_network = list(cattrs.get("NetworkSettings", {}).get("Networks", {}).keys())[0]
values["network_mode"] = assumed_default_network
networks = None
else:
networklist = c.networks.list()
for network in networklist:
if network.attrs["Name"] in values["networks"]:
networks[network.attrs["Name"]] = {
"external": (not network.attrs["Internal"]),
"name": network.attrs["Name"],
}
# volumes = {}
# if values['volumes'] is not None:
# for volume in values['volumes']:
# volume_name = volume.split(':')[0]
# volumes[volume_name] = {'external': True}
# else:
# volumes = None
    # Build values["volumes"] for the service and the top-level volumes dict, handling both bind and volume
    # mount types and appending ":ro" for read-only mounts.
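    # e.g. a named volume mounted read-only becomes "myvol:/data:ro", a bind mount "/host/path:/container/path".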
volumes = {}
mountpoints = []
if values["mounts"] is not None:
for mount in values["mounts"]:
destination = mount["Destination"]
if not mount["RW"]:
destination = destination + ":ro"
if mount["Type"] == "volume":
mountpoints.append(mount["Name"] + ":" + destination)
if not createvolumes:
volumes[mount["Name"]] = {
"external": True
                    }  # mark as external so the existing volume is reused rather than created
elif mount["Type"] == "bind":
mountpoints.append(mount["Source"] + ":" + destination)
values["volumes"] = sorted(mountpoints)
if len(volumes) == 0:
volumes = None
values["mounts"] = None # remove this temporary data from the returned data
# Check for command and add it if present.
if cattrs.get("Config", {}).get("Cmd") is not None:
values["command"] = cattrs.get("Config", {}).get("Cmd")
# Check for exposed/bound ports and add them if needed.
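    # PortBindings entries render as "HostIp:HostPort:ContainerPort"; an empty HostIp leaves a leading ":"
    # that is stripped below. ExposedPorts is only used when there are no bindings.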
try:
expose_value = list(cattrs.get("Config", {}).get("ExposedPorts", {}).keys())
ports_value = [
cattrs.get("HostConfig", {}).get("PortBindings", {})[key][0]["HostIp"]
+ ":"
+ cattrs.get("HostConfig", {}).get("PortBindings", {})[key][0]["HostPort"]
+ ":"
+ key
for key in cattrs.get("HostConfig", {}).get("PortBindings")
]
# If bound ports found, don't use the 'expose' value.
if ports_value not in IGNORE_VALUES:
for index, port in enumerate(ports_value):
if port[0] == ":":
ports_value[index] = port[1:]
values["ports"] = ports_value
else:
values["expose"] = expose_value
    except (KeyError, TypeError):
        # No ports exposed/bound. Continue without them.
        pass
# fixup strings in labels and env
if values["labels"] is not None:
for label_key, label_value in values["labels"].items():
values["labels"][label_key] = shell_escape_string(label_value)
if values["environment"] is not None:
for idx, env_variable in enumerate(values["environment"]):
values["environment"][idx] = shell_escape_string(env_variable)
# Iterate through values to finish building yaml dict.
for key in values:
value = values[key]
if value not in IGNORE_VALUES:
ct[key] = value
return cfile, networks, volumes
if __name__ == "__main__":
main()