GitHub Actions for PR

Julien Bisconti 2020-04-13 16:15:18 +02:00
parent 9e09f33925
commit bf2e6ae389
GPG Key ID: 62772C6698F736CB
8 changed files with 381 additions and 29 deletions

@@ -1,22 +1,24 @@
module.exports = {
env: {
browser: true,
node: true
node: true,
},
extends: [
'airbnb-base',
'plugin:import/errors',
'plugin:import/warnings',
'prettier'
'prettier',
'eslint:recommended',
],
plugins: ['import', 'prettier'],
rules: {
camelcase: 0,
'import/order': [
'error',
{
groups: ['builtin', 'external', 'parent', 'sibling', 'index'],
'newlines-between': 'never'
}
'newlines-between': 'never',
},
],
'no-console': 0,
'prefer-template': 2,
@@ -24,8 +26,8 @@ module.exports = {
'error',
{
singleQuote: true,
trailingComma: 'all'
}
]
}
trailingComma: 'all',
},
],
},
};
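
For context, the `import/order` rule configured above groups requires/imports by origin and, with `'newlines-between': 'never'`, forbids blank lines between those groups. A minimal sketch of an ordering the updated config accepts (the module names are hypothetical, not part of this commit):

```js
// Group order from the config: builtin, external, parent, sibling, index,
// with no blank lines allowed between groups.
const fs = require('fs'); // builtin
const fetch = require('node-fetch'); // external
const shared = require('../shared'); // parent
const helpers = require('./helpers'); // sibling
```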

.github/workflow/pull_request.yml (new file)

@@ -0,0 +1,30 @@
name: Pull Requests
on:
pull_request:
branches:
- master
jobs:
test:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@01aecccf739ca6ff86c0539fbc67a7a5007bbc81
- uses: actions/setup-node@83c9f7a7df54d6b57455f7c57ac414f2ae5fb8de
with:
node-version: 12
- uses: actions/cache@70655ec8323daeeaa7ef06d7c56e1b9191396cbe
id: cache
with:
path: ~/.npm
key: ${{ runner.os }}-node-${{ hashFiles('**/package-lock.json') }}
restore-keys: |
${{ runner.os }}-node-
- name: Install Dependencies
if: steps.cache.outputs.cache-hit != 'true'
run: npm ci --ignore-scripts --no-audit --no-progress --prefer-offline
- run: npm run test
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}

@@ -17,7 +17,7 @@ If this list is not complete, you can [contribute][editreadme] to make it so. He
The creators and maintainers of this list do not receive any form of payment to accept a change made by any contributor. This page is not an official Docker product in any way. It is a list of links to projects and is maintained by volunteers. Everybody is welcome to contribute. The goal of this repo is to index open-source projects, not to advertise for profit.
All the links are monitored and tested with [awesome_bot](https://github.com/dkhamsing/awesome_bot) made by [@dkhamsing](https://github.com/dkhamsing)
# Contents <!-- omit in toc -->
<!-- TOC -->
@@ -95,7 +95,7 @@ _Source:_ [What is Docker](https://www.docker.com/why-docker)
- [Docker Curriculum](https://github.com/prakhar1989/docker-curriculum): A comprehensive tutorial for getting started with Docker. Teaches how to use Docker and deploy dockerized apps on AWS with Elastic Beanstalk and Elastic Container Service.
- [Docker Documentation](https://docs.docker.com/): the official documentation.
- [Docker for beginners](https://github.com/groda/big_data/blob/master/docker_for_beginners.md): A tutorial for beginners who need to learn the basics of Docker—from "Hello world!" to basic interactions with containers, with simple explanations of the underlying concepts.
- [Docker for novices](https://www.youtube.com/watch?v=xsjSadjKXns) An introduction to Docker for developers and testers who have never used it. (Video 1h40, recorded linux.conf.au 2019 — Christchurch, New Zealand) by Alex Clews.
- [Docker for novices](https://www.youtube.com/watch?v=xsjSadjKXns) An introduction to Docker for developers and testers who have never used it. (Video 1h40, recorded linux.conf.au 2019 — Christchurch, New Zealand) by Alex Clews.
- [Docker Training](https://success.docker.com/training) :heavy_dollar_sign:
- [Docker Tutorial for Beginners (Updated 2019 version)](https://hashnode.com/post/docker-tutorial-for-beginners-cjrj2hg5001s2ufs1nker9he2) — In this Docker tutorial, you'll learn all the basics and learn how you can containerize Node.js and Go applications. Even if you aren't familiar with these languages it should be easy for you to follow this tutorial and use any other language.
- [Katacoda](https://www.katacoda.com/courses/docker): Learn Docker using Interactive Browser-Based Labs
@@ -637,7 +637,7 @@ Services to securely store your Docker images.
- [AppDynamics](https://www.appdynamics.com/community/exchange/extension/docker-monitoring-extension/) :heavy_dollar_sign: - AppDynamics gives enterprises real-time insights into application performance, user performance, and business performance so they can move faster in an increasingly sophisticated, software-driven world.
- [Axibase Time-Series Database](https://axibase.com/products/axibase-time-series-database/writing-data/docker-cadvisor/) :heavy_dollar_sign: - Long-term retention of container statistics and built-in dashboards for Docker. Collected with native Google cAdvisor storage driver.
- [Broadcom Docker Monitoring](https://www.broadcom.com/info/aiops/docker-monitoring) :heavy_dollar_sign: - Agile Operations solutions from Broadcom deliver the modern Docker monitoring businesses need to accelerate and optimize the performance of microservices and the dynamic Docker environments running them. Monitor both the Docker environment and apps that run inside them. (former CA Technologies)
- [Collecting docker logs and stats with Splunk](https://www.splunk.com/en_us/blog/cloud/collecting-docker-logs-and-stats-with-splunk.html )
- [Collecting docker logs and stats with Splunk](https://www.splunk.com/en_us/blog/cloud/collecting-docker-logs-and-stats-with-splunk.html)
- [Datadog](https://www.datadoghq.com/) :heavy_dollar_sign: - Datadog is a full-stack monitoring service for large-scale cloud environments that aggregates metrics/events from servers, databases, and applications. It includes support for Docker, Kubernetes, and Mesos.
- [Prometheus](https://prometheus.io/) :heavy_dollar_sign: - Open-source service monitoring system and time series database
- [Site24x7](https://www.site24x7.com/docker-monitoring.html) :heavy_dollar_sign: - Docker Monitoring for DevOps and IT is a SaaS Pay per Host model
@@ -666,13 +666,11 @@ Services to securely store your Docker images.
- [Awesome Sysadmin](https://github.com/n1trux/awesome-sysadmin) by [@n1trux](https://github.com/n1trux)
- [ToolsOfTheTrade](https://github.com/cjbarber/ToolsOfTheTrade) a list of SaaS and On premise applications by [@cjbarber](https://github.com/cjbarber)
## Demos and Examples
- [Webstack-micro](https://github.com/ferbs/webstack-micro) Demo web app showing how Docker Compose might be used to set up an API Gateway, centralized authentication, background workers, and WebSockets as containerized services.
- [An Annotated Docker Config for Frontend Web Development](https://nystudio107.com/blog/an-annotated-docker-config-for-frontend-web-development) A local development environment with Docker allows you to shrink-wrap the devops your project needs as config, making onboarding frictionless.
## Good Tips
- [Dealing with linked containers dependency in docker-compose](http://brunorocha.org/python/dealing-with-linked-containers-dependency-in-docker-compose.html) by [@rochacbruno](https://github.com/rochacbruno)

@@ -2,7 +2,6 @@ const fs = require('fs-extra');
const cheerio = require('cheerio');
const showdown = require('showdown');
const Parcel = require('parcel-bundler');
// const sm = require('sitemap');
const { SitemapStream, streamToPromise } = require('sitemap');
process.env.NODE_ENV = 'production';
@@ -13,7 +12,7 @@ const LOG = {
if (process.env.DEBUG) console.log('💡 DEBUG: ', { ...args });
},
};
const handleFailure = err => {
const handleFailure = (err) => {
LOG.error(err);
process.exit(1);
};
@@ -90,7 +89,7 @@ const bundle = () => {
smStream.end();
return streamToPromise(smStream);
})
.then(sm =>
.then((sm) =>
// Creates a sitemap object given the input configuration with URLs
fs.outputFile(
'dist/sitemap.xml',
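
The edit above drops the commented-out `sitemap` require left over from the older API and keeps only the streaming interface. A minimal sketch of the `SitemapStream`/`streamToPromise` pattern build.js relies on (the hostname and URL are placeholders, not values from this repo):

```js
const { SitemapStream, streamToPromise } = require('sitemap');

// Write entries, end the stream, then collect the generated XML as a Buffer.
const smStream = new SitemapStream({ hostname: 'https://example.com' });
smStream.write({ url: '/', changefreq: 'daily' });
smStream.end();
streamToPromise(smStream).then((xml) => console.log(xml.toString()));
```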

old_build_beta.js (new file)

@@ -0,0 +1,132 @@
const fs = require('fs-extra');
const fetch = require('node-fetch');
require('draftlog').into(console);
const LOG = {
error: (...args) => console.error(' ERROR', { ...args }),
debug: (...args) => {
if (process.env.DEBUG) console.log('💡 DEBUG: ', { ...args });
},
};
const handleFailure = (err) => {
LOG.error(err);
process.exit(1);
};
process.on('unhandledRejection', handleFailure);
if (!process.env.GITHUB_TOKEN) {
LOG.error('no credentials found.');
process.exit(1);
}
const TOKEN = process.env.GITHUB_TOKEN;
// --- ENV VAR ---
const BATCH_SIZE = parseInt(process.env.BATCH_SIZE, 10) || 10;
const DELAY = parseInt(process.env.DELAY, 10) || 3000;
const INTERVAL = parseInt(process.env.INTERVAL, 10) || 1;
const INTERVAL_UNIT = process.env.INTERVAL_UNIT || 'days';
// --- FILES ---
const DATA_FOLDER = 'data';
const README = 'README.md';
const LATEST_FILENAME = `${DATA_FOLDER}/latest`;
const GITHUB_REPOS = `${DATA_FOLDER}/repository.json`;
const Authorization = `token ${TOKEN}`;
// --- HTTP ---
const API = 'https://api.github.com/';
const options = {
method: 'GET',
headers: {
'User-Agent': 'awesome-docker script listing',
'Content-Type': 'application/json',
Authorization,
},
};
// ----------------------------------------------------------------------------
const removeHost = (x) => x.slice('https://github.com/'.length, x.length);
const delay = (ms) =>
new Promise((resolve) => {
setTimeout(() => resolve(), ms);
});
const get = (pathURL, opt) => {
LOG.debug(`Fetching ${pathURL}`);
return fetch(`${API}repos/${pathURL}`, {
...options,
...opt,
})
.catch(handleFailure)
.then((response) => {
if (response.ok) return response.json();
throw new Error('Network response was not ok.');
})
.catch(handleFailure);
};
const fetchAll = (batch) =>
Promise.all(batch.map(async (pathURL) => get(pathURL)));
const extractAllLinks = (markdown) => {
const re = /((([A-Za-z]{3,9}:(?:\/\/)?)(?:[\-;:&=\+\$,\w]+@)?[A-Za-z0-9\.\-]+|(?:www\.|[\-;:&=\+\$,\w]+@)[A-Za-z0-9\.\-]+)((?:\/[\+~%\/\.\w\-_]*)?\??(?:[\-\+=&;%@\.\w_]*)#?(?:[\.\!\/\\\w]*))?)/g;
return markdown.match(re);
};
const extractAllRepos = (markdown) => {
const re = /https:\/\/github\.com\/([a-zA-Z0-9-._]+)\/([a-zA-Z0-9-._]+)/g;
const md = markdown.match(re);
return [...new Set(md)];
};
const ProgressBar = (i, batchSize, total) => {
const progress = Math.round((i / total) * 100);
const units = Math.round(progress / 2);
const barLine = console.draft('Starting batch...');
return barLine(
`[${'='.repeat(units)}${' '.repeat(50 - units)}] ${progress}% - # ${i}`,
);
};
// ----------------------------------------------------------------------------
async function batchFetchRepoMetadata(githubRepos) {
const repos = githubRepos.map(removeHost);
const metadata = [];
/* eslint-disable no-await-in-loop */
for (let i = 0; i < repos.length; i += BATCH_SIZE) {
const batch = repos.slice(i, i + BATCH_SIZE);
LOG.debug({ batch });
const res = await fetchAll(batch);
LOG.debug('batch fetched...');
metadata.push(...res);
ProgressBar(i, BATCH_SIZE, repos.length);
// poor man's rate limiting so github doesn't ban us
await delay(DELAY);
}
ProgressBar(repos.length, BATCH_SIZE, repos.length);
return metadata;
}
async function main() {
try {
const markdown = await fs.readFile(README, 'utf8');
const links = extractAllLinks(markdown);
const githubRepos = extractAllRepos(markdown);
LOG.debug('writing repo list to disk...');
await fs.outputJSON(GITHUB_REPOS, githubRepos, { spaces: 2 });
LOG.debug('fetching data...');
const metadata = await batchFetchRepoMetadata(githubRepos);
LOG.debug('gracefully shutting down.');
process.exit();
} catch (err) {
handleFailure(err);
}
}
main();
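
To illustrate the extraction step in this script: `extractAllRepos` keeps only GitHub repository URLs and de-duplicates them through a `Set`. A self-contained sketch using the same regex (the markdown fragment is made up for the example):

```js
// Same repo-matching regex as extractAllRepos above.
const re = /https:\/\/github\.com\/([a-zA-Z0-9-._]+)\/([a-zA-Z0-9-._]+)/g;
const sample =
  '- [moby](https://github.com/moby/moby) and again https://github.com/moby/moby';
console.log([...new Set(sample.match(re))]);
// -> [ 'https://github.com/moby/moby' ]
```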

package-lock.json (generated)

@@ -1229,6 +1229,14 @@
"resolved": "https://registry.npmjs.org/atob/-/atob-2.1.2.tgz",
"integrity": "sha512-Wm6ukoaOGJi/73p/cl2GvLjTI5JM1k/O14isD73YML8StrH/7/lRFgmg8nICZgD3bZZvjwCGxtMOD3wWNAu8cg=="
},
"awesome-readme-to-data": {
"version": "0.0.3",
"resolved": "https://registry.npmjs.org/awesome-readme-to-data/-/awesome-readme-to-data-0.0.3.tgz",
"integrity": "sha512-DBbPrfFxk/TGgIzzjpraiAf8zssav50E3ufMnleFv9fr28khbnAZlAQx153KQDQVbAEWt2EJfuwJdzEYRKM7/Q==",
"requires": {
"marked": "^0.8.2"
}
},
"aws-sign2": {
"version": "0.7.0",
"resolved": "https://registry.npmjs.org/aws-sign2/-/aws-sign2-0.7.0.tgz",
@@ -4878,6 +4886,11 @@
"object-visit": "^1.0.0"
}
},
"marked": {
"version": "0.8.2",
"resolved": "https://registry.npmjs.org/marked/-/marked-0.8.2.tgz",
"integrity": "sha512-EGwzEeCcLniFX51DhTpmTom+dSA/MG/OBUDjnWtHbEnjAH180VzUeAw+oE4+Zv+CoYBWyRlYOTR0N8SO9R1PVw=="
},
"md5.js": {
"version": "1.3.5",
"resolved": "https://registry.npmjs.org/md5.js/-/md5.js-1.3.5.tgz",
@@ -5008,18 +5021,11 @@
}
},
"mkdirp": {
"version": "0.5.1",
"resolved": "https://registry.npmjs.org/mkdirp/-/mkdirp-0.5.1.tgz",
"integrity": "sha1-MAV0OOrGz3+MR2fzhkjWaX11yQM=",
"version": "0.5.5",
"resolved": "https://registry.npmjs.org/mkdirp/-/mkdirp-0.5.5.tgz",
"integrity": "sha512-NKmAlESf6jMGym1++R0Ra7wvhV+wFW63FaSOFPwRahvea0gMUcGUhVeAg/0BC0wiv9ih5NYPB1Wn1UEI1/L+xQ==",
"requires": {
"minimist": "0.0.8"
},
"dependencies": {
"minimist": {
"version": "0.0.8",
"resolved": "https://registry.npmjs.org/minimist/-/minimist-0.0.8.tgz",
"integrity": "sha1-hX/Kv8M5fSYluCKCYuhqp6ARsF0="
}
"minimist": "^1.2.5"
}
},
"ms": {
@@ -5073,6 +5079,11 @@
"resolved": "https://registry.npmjs.org/node-addon-api/-/node-addon-api-1.7.1.tgz",
"integrity": "sha512-2+DuKodWvwRTrCfKOeR24KIc5unKjOh8mz17NCzVnHWfjAdDqbfbjqh7gUT+BkXBRQM52+xCHciKWonJ3CbJMQ=="
},
"node-fetch": {
"version": "2.6.0",
"resolved": "https://registry.npmjs.org/node-fetch/-/node-fetch-2.6.0.tgz",
"integrity": "sha512-8dG4H5ujfvFiqDmVu9fQ5bOHUC15JMjMY/Zumv26oOvvVJjM67KF8koCWIabKQ1GJIa9r2mMZscBq/TbdOcmNA=="
},
"node-forge": {
"version": "0.7.6",
"resolved": "https://registry.npmjs.org/node-forge/-/node-forge-0.7.6.tgz",

@@ -4,7 +4,8 @@
"description": "A curated list of Docker resources and projects Inspired by @sindresorhus and improved by amazing contributors",
"main": "build.js",
"scripts": {
"build": "rimraf ./dist/ && node build.js"
"build": "rimraf ./dist/ && node build.js",
"test": "node pull_request.js"
},
"repository": {
"type": "git",
@@ -17,9 +18,11 @@
},
"homepage": "https://github.com/veggiemonk/awesome-docker#readme",
"dependencies": {
"awesome-readme-to-data": "0.0.3",
"cheerio": "1.0.0-rc.3",
"draftlog": "1.0.12",
"fs-extra": "9.0.0",
"node-fetch": "2.6.0",
"parcel-bundler": "1.12.4",
"rimraf": "3.0.2",
"showdown": "1.9.1",
@@ -36,4 +39,4 @@
"minimist": "1.2.5",
"prettier": "2.0.4"
}
}
}

pull_request.js (new file)

@@ -0,0 +1,177 @@
const fs = require('fs-extra');
const fetch = require('node-fetch');
function envvar_undefined(variable_name) {
throw new Error(`${variable_name} must be defined`);
}
console.log({
DEBUG: process.env.DEBUG,
});
const README = 'README.md';
const GITHUB_GQL_API = 'https://api.github.com/graphql';
const TOKEN = process.env.GITHUB_TOKEN || envvar_undefined('GITHUB_TOKEN');
const LINKS_OPTIONS = {
redirect: 'error',
headers: {
'Content-Type': 'application/json',
'user-agent':
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_4) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/80.0.3987.149 Safari/537.36',
},
};
const Authorization = `token ${TOKEN}`;
const make_GQL_options = (query) => ({
method: 'POST',
headers: {
Authorization,
'Content-Type': 'application/json',
'user-agent':
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_4) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/80.0.3987.149 Safari/537.36',
},
body: JSON.stringify({ query }),
});
const LOG = {
error: (...args) => console.error('❌ ERROR', { ...args }),
error_string: (...args) =>
console.error('❌ ERROR', JSON.stringify({ ...args })),
debug: (...args) => {
if (process.env.DEBUG) console.log('>>> DEBUG: ', { ...args });
},
debug_string: (...args) => {
if (process.env.DEBUG)
console.log('>>> DEBUG: ', JSON.stringify({ ...args }));
},
};
const handleFailure = (error) => {
console.error(`${error.message}: ${error.stack}`, { error });
process.exit(1);
};
process.on('unhandledRejection', handleFailure);
const extract_all_links = (markdown) => {
// if you have a problem and you try to solve it with a regex,
// now you have two problems
const re = /(((https:(?:\/\/)?)(?:[-;:&=+$,\w]+@)?[A-Za-z0-9.-]+|(?:www\.|[-;:&=+$,\w]+@)[A-Za-z0-9.-]+)((?:\/[+~%/.\w\-_]*)?\??(?:[-+=&;%@.\w_]*)#?(?:[.!/\\\w]*))?)/g;
return markdown.match(re);
};
const find_duplicates = (arr) => {
const hm = {};
const dup = [];
arr.forEach((e) => {
if (hm[e]) dup.push(e);
else hm[e] = null;
});
return dup;
};
const partition = (arr, func) => {
const ap = [[], []];
arr.forEach((e) => (func(e) ? ap[0].push(e) : ap[1].push(e)));
return ap;
};
async function fetch_link(url) {
try {
const { ok, statusText, redirected } = await fetch(url, LINKS_OPTIONS);
return [url, { ok, status: statusText, redirected }];
} catch (error) {
return [url, { ok: false, status: error.message }];
}
}
async function batch_fetch({ arr, get, post_filter_func, BATCH_SIZE = 8 }) {
const result = [];
/* eslint-disable no-await-in-loop */
for (let i = 0; i < arr.length; i += BATCH_SIZE) {
const batch = arr.slice(i, i + BATCH_SIZE);
LOG.debug({ batch });
let res = await Promise.all(batch.map(get));
LOG.debug('batch fetched...');
res = post_filter_func ? res.filter(post_filter_func) : res;
LOG.debug_string({ res });
result.push(...res);
}
return result;
}
const extract_repos = (arr) =>
arr
.map((e) => e.substr('https://github.com/'.length).split('/'))
.filter((r) => r.length === 2 && r[1] !== '');
const generate_GQL_query = (arr) =>
`query AWESOME_REPOS{ ${arr
.map(
([owner, name]) =>
`repo_${owner.replace(/(-|\.)/g, '_')}_${name.replace(
/(-|\.)/g,
'_',
)}: repository(owner: "${owner}", name:"${name}"){ nameWithOwner } `,
)
.join('')} }`;
// =============================================================
// const batch_github_repos = async (github_links) => {
// const BATCH_SIZE = 50;
// const repos = extract_repos(github_links);
// for (let i = 0; i < repos.length; i += BATCH_SIZE) {
// const batch = repos.slice(i, i + BATCH_SIZE);
// const query = generate_GQL_query(batch);
// LOG.debug({ query });
// const gql_response = await fetch(
// 'https://api.github.com/graphql',
// make_GQL_options(query),
// ).then((r) => r.json());
// LOG.debug({ gql_response });
// }
// };
// =============================================================
async function main() {
const markdown = await fs.readFile(README, 'utf8');
const links = extract_all_links(markdown);
const duplicates = find_duplicates(links);
if (duplicates.length > 0) {
LOG.error_string({ duplicates });
}
const [github_links, other_links] = partition(links, (link) =>
link.startsWith('https://github.com'),
);
const other_links_error = await batch_fetch({
arr: other_links,
get: fetch_link,
post_filter_func: (x) => !x[1].ok,
BATCH_SIZE: 8,
});
if (other_links_error.length > 0) {
LOG.error_string({ other_links_error });
}
const repos = extract_repos(github_links);
const query = generate_GQL_query(repos);
const options = make_GQL_options(query);
const gql_response = await fetch(GITHUB_GQL_API, options).then((r) =>
r.json(),
);
const { data } = gql_response;
if (gql_response.errors) {
LOG.error_string({ errors: gql_response.errors });
}
const repos_fetched = Object.entries(data)
.map(([, /* k , */ v]) => v.nameWithOwner)
.sort((a, b) => b - a);
console.log({ repos_fetched: repos_fetched.length });
}
console.log('starting...');
main();
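
To make the batching concrete: `generate_GQL_query` aliases each repository so one GraphQL request can resolve many repos at once. A short usage sketch, assuming the `generate_GQL_query` defined above is in scope (the two repositories are illustrative, not taken from the commit):

```js
const repos = [
  ['veggiemonk', 'awesome-docker'],
  ['moby', 'moby'],
];
// Dots and dashes in owner/name become underscores in the alias.
console.log(generate_GQL_query(repos));
// query AWESOME_REPOS{ repo_veggiemonk_awesome_docker: repository(owner: "veggiemonk", name:"awesome-docker"){ nameWithOwner } repo_moby_moby: repository(owner: "moby", name:"moby"){ nameWithOwner }  }
```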