commit (string, 40 chars) | old_file (string, 4-237 chars) | new_file (string, 4-237 chars) | old_contents (string, 1-4.24k chars) | new_contents (string, 1-4.87k chars) | subject (string, 15-778 chars) | message (string, 15-8.75k chars) | lang (266 classes) | license (13 classes) | repos (string, 5-127k chars)
---|---|---|---|---|---|---|---|---|---
2df4e95e90e99860fc730e26d48cad499ac67df2
|
.travis.yml
|
.travis.yml
|
language: csharp
sudo: required
dist: trusty
mono: none
dotnet: 2.0.0
env:
global:
- marten_testing_database="Host=localhost;Port=5432;Database=marten_test;Username=postgres;Password=Password12!"
addons:
postgresql: "9.5"
before_install:
- sudo apt-get install postgresql-9.5-plv8
- nvm install 6.0.0
install:
- export DOTNET_CLI_TELEMETRY_OPTOUT=1
- export DOTNET_SKIP_FIRST_TIME_EXPERIENCE=1
- travis_retry npm install
before_script:
- psql -c 'create database marten_test;' -U postgres
- psql -d marten_test -c 'create extension if not exists plv8;'
- chmod a+x ./build.sh
script:
- ./build.sh
|
language: csharp
sudo: required
dist: trusty
mono: none
dotnet: 2.1.400
env:
global:
- marten_testing_database="Host=localhost;Port=5432;Database=marten_test;Username=postgres;Password=Password12!"
addons:
postgresql: "9.5"
before_install:
- sudo apt-get install postgresql-9.5-plv8
- nvm install 6.0.0
install:
- export DOTNET_CLI_TELEMETRY_OPTOUT=1
- export DOTNET_SKIP_FIRST_TIME_EXPERIENCE=1
- travis_retry npm install
before_script:
- psql -c 'create database marten_test;' -U postgres
- psql -d marten_test -c 'create extension if not exists plv8;'
- chmod a+x ./build.sh
script:
- ./build.sh
|
Update dotnet sdk version to 2.1.400 for TravisCI [skip appveyor]
|
Update dotnet sdk version to 2.1.400 for TravisCI [skip appveyor]
|
YAML
|
mit
|
mysticmind/marten,mysticmind/marten,mdissel/Marten,ericgreenmix/marten,mdissel/Marten,ericgreenmix/marten,mysticmind/marten,JasperFx/Marten,JasperFx/Marten,JasperFx/Marten,ericgreenmix/marten,mysticmind/marten,ericgreenmix/marten
|
8f5ce745cf0da839c9ec170edd71324e0a34d154
|
.travis.yml
|
.travis.yml
|
language: python
python:
- 2.7
env:
- CFLAGS=-I/usr/include/gdal
before_install:
- sudo add-apt-repository -y ppa:ubuntugis/ubuntugis-unstable
- sudo add-apt-repository -y ppa:bkgaley/ppa
- sudo apt-get -qq update
- sudo apt-get -y install libgdal-dev libgdal1h libboost1.58-all-dev gdal-bin
install:
- pip install --global-option=build_ext --global-option='-USQLITE_OMIT_LOAD_EXTENSION' pysqlite
- pip install GDAL==1.10.0
- pip install -r requirements.txt
- pip install coveralls
script: make coverage
after_success: coveralls
|
language: python
python:
- 2.7
env:
- CFLAGS=-I/usr/include/gdal
before_install:
- sudo add-apt-repository -y ppa:ubuntugis/ubuntugis-unstable
- sudo add-apt-repository -y ppa:bkgaley/ppa
- sudo apt-get -qq update
- sudo apt-get -y install libgdal-dev libgdal1h gdal-bin
libboost-atomic1.58-dev libboost-atomic1.58.0 libboost-chrono1.58-dev
libboost-chrono1.58.0 libboost-date-time1.58-dev
libboost-filesystem1.58-dev libboost-program-options1.58-dev
libboost-python1.58-dev libboost-regex1.58-dev
libboost-serialization1.58-dev libboost-system1.58-dev
libboost-thread1.58-dev libboost1.58-dev libcairo-script-interpreter2
libcairo2-dev libfontconfig1-dev libfreetype6-dev libglib2.0-dev
libjbig-dev libpixman-1-dev libtiff4-dev libxcb-shm0-dev libxrender-dev
x11proto-render-dev
install:
- pip install --global-option=build_ext --global-option='-USQLITE_OMIT_LOAD_EXTENSION' pysqlite
- pip install GDAL==1.10.0
- pip install -r requirements.txt
- pip install coveralls
script: make coverage
after_success: coveralls
|
Add remaining build deps from python-mapnik
|
Add remaining build deps from python-mapnik
|
YAML
|
bsd-3-clause
|
bkg/django-spillway
|
1bc2652319e409b9128e632b3fa046d1d9355ae6
|
.travis.yml
|
.travis.yml
|
language: php
php:
- '5.6'
- '7.0'
- '7.1'
- hhvm
- nightly
sudo: false
cache:
directories:
- $HOME/.composer/cache
before_script:
- composer install --dev --prefer-source --no-interaction
|
language: php
php:
- '5.6'
- '7.0'
- '7.1'
- nightly
sudo: false
cache:
directories:
- $HOME/.composer/cache
before_script:
- composer install --prefer-source --no-interaction
|
Remove hhmv, remove --dev flag from composer
|
Remove hhmv, remove --dev flag from composer
|
YAML
|
mit
|
rjvandoesburg/laravel-jira-webhook
|
e2f66099b8e5435b51688284182638e817648cf2
|
.travis.yml
|
.travis.yml
|
language: java
jdk:
- oraclejdk8
before_cache:
- rm -f $HOME/.gradle/caches/modules-2/modules-2.lock
- rm -fr $HOME/.gradle/caches/*/plugin-resolution/
cache:
directories:
- $HOME/.gradle/caches/
- $HOME/.gradle/wrapper/
script:
- ./gradlew build --console plain
|
language: java
jdk:
- openjdk8
before_cache:
- rm -f $HOME/.gradle/caches/modules-2/modules-2.lock
- rm -fr $HOME/.gradle/caches/*/plugin-resolution/
cache:
directories:
- $HOME/.gradle/caches/
- $HOME/.gradle/wrapper/
script:
- ./gradlew build --console plain
|
Switch to openjdk for Travis checks.
|
Switch to openjdk for Travis checks.
Travis started switching to Ubuntu 16.04 (Xenial) from 14.04 (Trusty)
for all builds. The Oracle JDK does not come pre-installed any more and
is causing the issue during install:
```
Installing oraclejdk8
$ export JAVA_HOME=~/oraclejdk8
$ export PATH="$JAVA_HOME/bin:$PATH"
$ ~/bin/install-jdk.sh --target "/home/travis/oraclejdk8" --workspace
"/home/travis/.cache/install-jdk" --feature "8" --license "BCL"
install-jdk.sh 2019-05-02
Expected feature release number in range of 9 to 13, but got: 8
The command "~/bin/install-jdk.sh --target "/home/travis/oraclejdk8"
--workspace "/home/travis/.cache/install-jdk" --feature "8" --license
"BCL"" failed and exited with 3 during .
Your build has been stopped.
```
One option is to set `dist: trusty` but that's an EOL distro at this
point. Switching to openjdk feels safer choice at this point.
References:
* https://blog.travis-ci.com/2019-04-15-xenial-default-build-environment
* https://travis-ci.community/t/install-of-oracle-jdk-8-failing/3038
|
YAML
|
apache-2.0
|
sixninetynine/pygradle,sixninetynine/pygradle,sixninetynine/pygradle,sixninetynine/pygradle
|
f3fe18b8138c43f2a8c33970e6c3a03c3110b9a9
|
.travis.yml
|
.travis.yml
|
language: swift
os:
- osx
script:
- swift build
- swift test
|
language: swift
os:
- osx
osx_image: xcode8.3
script:
- swift build
- swift test
|
Use Xcode 8.3 image on macOS
|
[Travis] Use Xcode 8.3 image on macOS
|
YAML
|
mit
|
iosdevzone/Emarcam,iosdevzone/Emarcam,iosdevzone/Emarcam
|
afd2ffe539018ae24767b223acecabd6c95c4f8c
|
.travis.yml
|
.travis.yml
|
language: ruby
cache: bundler
rvm:
- 2.2
- 2.3
- 2.4
- 2.5
- 2.6
- 2.7
env:
matrix:
- RAILS_VERSION='~> 4.2.0' SQLITE_VERSION='~> 1.3.6'
- RAILS_VERSION='~> 5.0.0' SQLITE_VERSION='~> 1.3.6'
- RAILS_VERSION='~> 5.1.0'
- RAILS_VERSION='~> 5.2.0'
- RAILS_VERSION='~> 6.0.0'
matrix:
exclude:
- rvm: 2.2
env: RAILS_VERSION='~> 6.0.0'
- rvm: 2.3
env: RAILS_VERSION='~> 6.0.0'
- rvm: 2.4
env: RAILS_VERSION='~> 6.0.0'
# Rails 4.2 uses BigDecimal.new, which Ruby 2.7 removed
- rvm: 2.7
env: RAILS_VERSION='~> 4.2.0'
|
language: ruby
cache: bundler
rvm:
- 2.2
- 2.3
- 2.4
- 2.5
- 2.6
- 2.7
env:
matrix:
- RAILS_VERSION='~> 4.2.0' SQLITE_VERSION='~> 1.3.6'
- RAILS_VERSION='~> 5.0.0' SQLITE_VERSION='~> 1.3.6'
- RAILS_VERSION='~> 5.1.0'
- RAILS_VERSION='~> 5.2.0'
- RAILS_VERSION='~> 6.0.0'
matrix:
exclude:
- rvm: 2.2
env: RAILS_VERSION='~> 6.0.0'
- rvm: 2.3
env: RAILS_VERSION='~> 6.0.0'
- rvm: 2.4
env: RAILS_VERSION='~> 6.0.0'
# Rails 4.2 uses BigDecimal.new, which Ruby 2.7 removed
- rvm: 2.7
env: RAILS_VERSION='~> 4.2.0' SQLITE_VERSION='~> 1.3.6'
|
Use full env for matrix exclude
|
Use full env for matrix exclude
|
YAML
|
mit
|
jhawthorn/discard,jhawthorn/discard
|
383c17e387b833a7a1d1db15a3c738505fb996b2
|
.travis.yml
|
.travis.yml
|
language: ruby
cache: bundler
dist: trusty
sudo: false
env:
- VAULT_VERSION=0.9.6
- VAULT_VERSION=0.8.3
- VAULT_VERSION=0.7.3
- VAULT_VERSION=0.6.5
- VAULT_VERSION=0.5.3
- VAULT_VERSION=0.4.1
- VAULT_VERSION=0.3.1
before_install:
- wget -O vault.zip -q https://releases.hashicorp.com/vault/${VAULT_VERSION}/vault_${VAULT_VERSION}_linux_amd64.zip
- unzip vault.zip
- mkdir -p ~/bin
- mv vault ~/bin
- export PATH="~/bin:$PATH"
branches:
only:
- master
rvm:
- "2.4"
- "2.5"
- "2.6"
before_script:
- bundle exec rake app:db:create
- bundle exec rake app:db:schema:load
- bundle exec rake app:db:test:prepare
- bundle exec appraisal install
script: bundle exec appraisal rake
|
language: ruby
cache: bundler
dist: trusty
sudo: false
env:
- VAULT_VERSION=0.9.6
- VAULT_VERSION=0.8.3
- VAULT_VERSION=0.7.3
- VAULT_VERSION=0.6.5
- VAULT_VERSION=0.5.3
- VAULT_VERSION=0.4.1
- VAULT_VERSION=0.3.1
before_install:
# Downgrade Bundler to 1.x
- gem uninstall -v '>= 2' -i $(rvm gemdir)@global -ax bundler || true
- gem install bundler -v '< 2'
- wget -O vault.zip -q https://releases.hashicorp.com/vault/${VAULT_VERSION}/vault_${VAULT_VERSION}_linux_amd64.zip
- unzip vault.zip
- mkdir -p ~/bin
- mv vault ~/bin
- export PATH="~/bin:$PATH"
branches:
only:
- master
rvm:
- "2.4"
- "2.5"
- "2.6"
before_script:
- bundle exec rake app:db:create
- bundle exec rake app:db:schema:load
- bundle exec rake app:db:test:prepare
- bundle exec appraisal install
script: bundle exec appraisal rake
|
Use Bundler 1.X on Travis
|
Use Bundler 1.X on Travis
|
YAML
|
mpl-2.0
|
hashicorp/vault-rails,hashicorp/vault-rails
|
29a75fe1a4d2bd4bb33eb297f7e0b9ed9e3802fd
|
.travis.yml
|
.travis.yml
|
language: ruby
rvm:
- 1.9.3
branches:
only:
- master
services:
- redis-server
notifications:
email: false
script:
- bundle exec rake test
before_script:
- psql -c 'create database ontohub_test;' -U postgres
- sudo apt-add-repository -y ppa:hets/hets
- sudo apt-add-repository -y "deb http://archive.canonical.com/ubuntu precise partner"
- sudo apt-get update
- sudo apt-get install hets-core subversion
- sudo hets -update
- bundle exec rake db:migrate || true
|
language: ruby
rvm:
- 1.9.3
branches:
only:
- master
- CRUD
services:
- redis-server
notifications:
email: false
script:
- bundle exec rake test
before_script:
- psql -c 'create database ontohub_test;' -U postgres
- sudo apt-add-repository -y ppa:hets/hets
- sudo apt-add-repository -y "deb http://archive.canonical.com/ubuntu precise partner"
- sudo apt-get update
- sudo apt-get install hets-core subversion
- sudo hets -update
- bundle exec rake db:migrate || true
|
Test branch CRUD on Travis CI, too.
|
Test branch CRUD on Travis CI, too.
|
YAML
|
agpl-3.0
|
ontohub/ontohub,ontohub/ontohub,ontohub/ontohub,ontohub/ontohub,ontohub/ontohub,ontohub/ontohub
|
0c218695366ef92082d91b98d12e74ec73c014f9
|
.travis.yml
|
.travis.yml
|
language: python
python:
- "2.7"
install: "pip install -r requirements.txt --use-mirrors"
before_script:
- npm install -g jshint
- cp settings.sample.py settings.py
- export=LINTREVIEW_SETTINGS=./settings.py
script: "nosetests"
notifications:
email:
on_failure: change
|
language: python
python:
- "2.7"
install: "pip install -r requirements.txt --use-mirrors"
before_script:
- npm install -g jshint
- cp settings.sample.py settings.py
env:
- LINTREVIEW_SETTINGS="./settings.py"
script: "nosetests"
notifications:
email:
on_failure: change
|
Update how environment vars are set.
|
Update how environment vars are set.
|
YAML
|
mit
|
zoidbergwill/lint-review,adrianmoisey/lint-review,markstory/lint-review,markstory/lint-review,adrianmoisey/lint-review,markstory/lint-review,zoidbergwill/lint-review,zoidbergwill/lint-review
|
5919d454d84739edf90d3478afaec9fc559e9f2f
|
50kafka.yml
|
50kafka.yml
|
apiVersion: apps/v1beta1
kind: StatefulSet
metadata:
name: kafka
namespace: kafka
spec:
serviceName: "broker"
replicas: 3
template:
metadata:
labels:
app: kafka
spec:
terminationGracePeriodSeconds: 10
containers:
- name: broker
image: solsson/kafka:0.11.0.0-rc2@sha256:c1316e0131f4ec83bc645ca2141e4fda94e0d28f4fb5f836e15e37a5e054bdf1
ports:
- containerPort: 9092
command:
- sh
- -c
- "./bin/kafka-server-start.sh config/server.properties --override log.retention.hours=-1 --override log.dirs=/opt/kafka/data/topics --override broker.id=${HOSTNAME##*-}"
volumeMounts:
- name: datadir
mountPath: /opt/kafka/data
volumeClaimTemplates:
- metadata:
name: datadir
spec:
accessModes: [ "ReadWriteOnce" ]
resources:
requests:
storage: 200Gi
|
apiVersion: apps/v1beta1
kind: StatefulSet
metadata:
name: kafka
namespace: kafka
spec:
serviceName: "broker"
replicas: 3
template:
metadata:
labels:
app: kafka
spec:
terminationGracePeriodSeconds: 10
containers:
- name: broker
image: solsson/kafka:0.11.0.0-rc2@sha256:c1316e0131f4ec83bc645ca2141e4fda94e0d28f4fb5f836e15e37a5e054bdf1
ports:
- containerPort: 9092
command:
- sh
- -c
- >
./bin/kafka-server-start.sh
config/server.properties
--override log.retention.hours=-1
--override log.dirs=/opt/kafka/data/topics
--override broker.id=${HOSTNAME##*-}
volumeMounts:
- name: datadir
mountPath: /opt/kafka/data
volumeClaimTemplates:
- metadata:
name: datadir
spec:
accessModes: [ "ReadWriteOnce" ]
resources:
requests:
storage: 200Gi
|
Support for generic image results in a verbose startup command, make it git friendly
|
Support for generic image results in a verbose startup command, make it git friendly
|
YAML
|
apache-2.0
|
Yolean/kubernetes-kafka
|
a9a922c3e1bc9f8b4905dcf9183e91081ed11301
|
data/transition-sites/defra_vla.yml
|
data/transition-sites/defra_vla.yml
|
---
site: defra_vla
whitehall_slug: animal-health-and-veterinary-laboratories-agency
title: Animal Health and Veterinary Laboratories Agency
redirection_date: 31st March 2014
homepage: https://www.gov.uk/government/organisations/animal-health-and-veterinary-laboratories-agency
tna_timestamp: 20130904121313
host: vla.defra.gov.uk
furl: www.gov.uk/ahvla
aliases:
- animalhealth.defra.gov.uk
- ahvla.defra.gov.uk
global: =301 https://www.gov.uk/government/organisations/animal-health-and-veterinary-laboratories-agency
|
---
site: defra_vla
whitehall_slug: animal-health-and-veterinary-laboratories-agency
title: Animal Health and Veterinary Laboratories Agency
redirection_date: 31st March 2014
homepage: https://www.gov.uk/government/organisations/animal-health-and-veterinary-laboratories-agency
tna_timestamp: 20130904121313
host: vla.defra.gov.uk
furl: www.gov.uk/ahvla
aliases:
- animalhealth.defra.gov.uk
- ahvla.defra.gov.uk
- www.svs.gov.uk
global: =301 https://www.gov.uk/government/organisations/animal-health-and-veterinary-laboratories-agency
|
Add alias to legacy VLA site
|
Add alias to legacy VLA site
|
YAML
|
mit
|
alphagov/transition-config,alphagov/transition-config
|
84f5fcd3929bf9d3d4c21ea35470e3c8c313b53f
|
defaults/main.yml
|
defaults/main.yml
|
---
# defaults file for threatstack
threatstack_url: https://app.threatstack.com
threatstack_v2_pkg_url: 'https://pkg.threatstack.com/v2'
threatstack_pkg_state: present
threatstack_pkg_validate: yes
# to set a version of the agent use threatstack-agent=X.Y.Z (Debian) or threatstack-agent-X.Y.Z (RedHat)
threatstack_pkg: threatstack-agent
threatstack_ruleset:
- 'Base Rule Set'
threatstack_configure_agent: true
threatstack_agent_extra_args:
threatstack_agent_config_args:
threatstack_agent_disable_service: false
|
---
# defaults file for threatstack
threatstack_url: https://app.threatstack.com
threatstack_v2_pkg_url: 'https://pkg.threatstack.com/v2'
threatstack_pkg_state: present
threatstack_pkg_validate: yes
# to set a version of the agent use threatstack-agent=X.Y.Z (Debian) or threatstack-agent-X.Y.Z (RedHat)
threatstack_pkg: threatstack-agent
threatstack_ruleset:
- 'Base Rule Set'
threatstack_hostname:
threatstack_configure_agent: true
threatstack_agent_extra_args:
threatstack_agent_config_args:
threatstack_agent_disable_service: false
|
Add back in hostname variable
|
Add back in hostname variable
|
YAML
|
mit
|
threatstack/threatstack-ansible
|
24faff8b034a8d57a9e67249dbf182c2c1c8ca9c
|
.github/workflows/build_msw_xml_libs.yml
|
.github/workflows/build_msw_xml_libs.yml
|
name: XML MSW binaries
on:
workflow_dispatch:
jobs:
build-xml-libs:
name: Build XML binaries
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v2
- name: Install Compiler
run: |
sudo apt-get update -qq
sudo apt-get install -qq --no-install-recommends g++-mingw-w64-i686 g++-mingw-w64-x86-64
- name: Build 32-bit Libraries
run: |
HOST=i686-w64-mingw32 ./scripts/ci/install.sh
- name: Build 64-bit Libraries
run: |
HOST=x86_64-w64-mingw32 ./scripts/ci/install.sh
- name: Upload
uses: actions/upload-artifact@v3
with:
name: xml-libs
path: /usr/local
|
name: XML MSW binaries
on:
workflow_dispatch:
jobs:
build-xml-libs:
name: Build XML binaries
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v2
- name: Install Compiler
run: |
sudo apt-get update -qq
sudo apt-get install -qq --no-install-recommends g++-mingw-w64-i686 g++-mingw-w64-x86-64
- name: Build 32-bit Libraries
run: |
HOST=i686-w64-mingw32 ./scripts/install_deps.sh
- name: Build 64-bit Libraries
run: |
HOST=x86_64-w64-mingw32 ./scripts/install_deps.sh
- name: Upload
uses: actions/upload-artifact@v3
with:
name: xml-libs
path: /usr/local
|
Fix wrong script name used in the workflow file
|
Fix wrong script name used in the workflow file
|
YAML
|
bsd-3-clause
|
vslavik/xmlwrapp,vslavik/xmlwrapp,vslavik/xmlwrapp
|
63b34ca9398c6ab0028db42878c79fb28c24a338
|
packages/hi/hipchat-hs.yaml
|
packages/hi/hipchat-hs.yaml
|
homepage: ''
changelog-type: ''
hash: 38b914e734ae78cc258a91ea4165a5354e7a0f98071848b0d964c9f8baa9e1c5
test-bench-deps: {}
maintainer: [email protected]
synopsis: Hipchat API bindings in Haskell
changelog: ''
basic-deps:
either: -any
bytestring: -any
split: -any
base: ! '>=4.8 && <4.9'
time: -any
servant-client: -any
text: -any
async: -any
servant: -any
lens: -any
string-conversions: -any
aeson: -any
all-versions:
- '0.0.2'
author: Oswyn Brent <[email protected]>
latest: '0.0.2'
description-type: haddock
description: Hipchat API bindings in Haskell
license-name: BSD3
|
homepage: ''
changelog-type: ''
hash: ea36bc392139d45e0ee519378a17d86dd12f57cc5c34d0e53b8fc660c4092078
test-bench-deps: {}
maintainer: [email protected]
synopsis: Hipchat API bindings in Haskell
changelog: ''
basic-deps:
http-client: -any
either: -any
bytestring: -any
aeson-casing: -any
split: -any
base: ! '>=4.8 && <4.9'
time: -any
servant-client: ! '>=0.6'
text: -any
async: -any
servant: ! '>=0.6'
lens: -any
network-uri: ! '>=2.6'
string-conversions: -any
aeson: ! '>=0.11'
all-versions:
- '0.0.2'
- '0.0.3'
author: Oswyn Brent <[email protected]>
latest: '0.0.3'
description-type: haddock
description: Hipchat API bindings in Haskell
license-name: BSD3
|
Update from Hackage at 2016-04-15T04:39:54+0000
|
Update from Hackage at 2016-04-15T04:39:54+0000
|
YAML
|
mit
|
commercialhaskell/all-cabal-metadata
|
be1a65b013da3779f0b66c06106369b8eba08732
|
_config.yml
|
_config.yml
|
# Site settings
title: Jekyll-Uno
description: 'Jekyll-Uno - a minimal, responsive theme for Jekyll'
url: 'http://joshgerdes.com/jekyll-uno'
baseurl: '/jekyll-uno/'
# google_analytics: 'UA-XXXXXX-X'
# disqus_shortname: 'your-disqus-name'
author:
name: 'Josh Gerdes'
email: [email protected]
twitter_username: joshgerdes
facebook_username: joshgerdes
github_username: joshgerdes
linkedin_username: joshgerdes
defaults:
-
scope:
path: ''
type: 'posts'
values:
layout: 'post'
# Build settings
destination: _site
paginate: 10
permalink: /:year/:title/
markdown: kramdown
highlighter: rouge
kramdown:
# use Github Flavored Markdown
input: GFM
# do not replace newlines by <br>s
hard_wrap: false
gems: ['jekyll-paginate']
exclude: ['README.md', 'Gemfile', 'Gemfile.lock', 'screenshot.png']
|
# Site settings
title: Mücahit İNEL
description: 'Hey! I will publish my resume and some blogs anymore'
url: 'https:minel.github.io'
baseurl: '/'
# google_analytics: 'UA-XXXXXX-X'
# disqus_shortname: 'your-disqus-name'
author:
name: 'Mücahit İNEL'
email: [email protected]
twitter_username: mucahit_inel
facebook_username: mucahit.in
github_username: minel
linkedin_username: mucahit_inel
defaults:
-
scope:
path: ''
type: 'posts'
values:
layout: 'post'
# Build settings
destination: _site
paginate: 10
permalink: /:year/:title/
markdown: kramdown
highlighter: rouge
kramdown:
# use Github Flavored Markdown
input: GFM
# do not replace newlines by <br>s
hard_wrap: false
gems: ['jekyll-paginate']
exclude: ['README.md', 'Gemfile', 'Gemfile.lock', 'screenshot.png']
|
Change baseurl and contact information
|
Change baseurl and contact information
|
YAML
|
mit
|
minel/minel.github.io,minel/minel.github.io,minel/minel.github.io
|
835e312d80bc5c6fc3c2931be66618cec91d0c78
|
_config.yml
|
_config.yml
|
name: Alykhan Kanji
exclude: [README.md, LICENSE.md, CNAME, Gemfile, Gemfile.lock, vendor]
permalink: blog/:categories/:year/:month/:day/:title/
defaults:
-
scope:
path: ""
type: "pages"
values:
layout: "default"
-
scope:
path: ""
type: "posts"
values:
layout: "post"
gems:
- jekyll_feed
|
name: Alykhan Kanji
exclude: [README.md, LICENSE.md, CNAME, Gemfile, Gemfile.lock, vendor]
permalink: blog/:categories/:year/:month/:day/:title/
defaults:
-
scope:
path: ""
type: "pages"
values:
layout: "default"
-
scope:
path: ""
type: "posts"
values:
layout: "post"
|
Revert "Set up automatic Atom feed generation"
|
Revert "Set up automatic Atom feed generation"
This reverts commit 35469e2231d2a9b92bf320b02c7878d2cbf9fa0b.
|
YAML
|
mit
|
alykhank/alykhank.github.io,alykhank/alykhank.github.io,alykhank/alykhank.github.io
|
57df04eee08f5f11c932c521a54570ea38375ffa
|
ansible/roles/mesos-slave/tasks/start.yml
|
ansible/roles/mesos-slave/tasks/start.yml
|
---
- name: Starting Mesos slave container
kolla_docker:
action: "start_container"
common_options: "{{ docker_common_options }}"
environment:
MESOS_HOSTNAME: "{{ inventory_hostname }}"
MESOS_IP: "{{ hostvars[inventory_hostname]['ansible_' + api_interface]['ipv4']['address'] }}"
MESOS_MASTER: "zk://{% for host in groups['zookeeper'] %}{{ hostvars[host]['ansible_' + hostvars[host]['api_interface']]['ipv4']['address'] }}:2181{% if not loop.last %},{% endif %}{% endfor %}/mesos"
MESOS_SWITCH_USER: "false"
MESOS_LOG_DIR: "/var/log/mesos"
MESOS_LOGGING_LEVEL: "INFO"
MESOS_DOCKER_REMOVE_DELAY: "{{ mesos_docker_remove_delay }}"
image: "{{ mesos_slave_image_full }}"
name: "mesos_slave"
privileged: True
volumes:
- /sys/fs/cgroup:/sys/fs/cgroup
- /var/run/docker.sock:/var/run/docker.sock
|
---
- name: Starting Mesos slave container
kolla_docker:
action: "start_container"
common_options: "{{ docker_common_options }}"
environment:
MESOS_HOSTNAME: "{{ inventory_hostname }}"
MESOS_IP: "{{ hostvars[inventory_hostname]['ansible_' + api_interface]['ipv4']['address'] }}"
MESOS_MASTER: "zk://{% for host in groups['zookeeper'] %}{{ hostvars[host]['ansible_' + hostvars[host]['api_interface']]['ipv4']['address'] }}:2181{% if not loop.last %},{% endif %}{% endfor %}/mesos"
MESOS_SWITCH_USER: "false"
MESOS_LOG_DIR: "/var/log/mesos"
MESOS_LOGGING_LEVEL: "INFO"
MESOS_DOCKER_REMOVE_DELAY: "{{ mesos_docker_remove_delay }}"
MESOS_ATTRIBUTES: "openstack_role:{% if 'controller' in group_names %}controller{% elif 'compute' in group_names %}compute{% endif %}"
MESOS_SYSTEMD_ENABLE_SUPPORT: "false"
image: "{{ mesos_slave_image_full }}"
name: "mesos_slave"
privileged: True
volumes:
- /sys/fs/cgroup:/sys/fs/cgroup
- /var/run/docker.sock:/var/run/docker.sock
|
Tag Mesos slaves with their OpenStack roles
|
Tag Mesos slaves with their OpenStack roles
To distinguish controller nodes from compute nodes, we need
to tag Mesos slaves. After that, we can use constraints in
Marathon to orchestrate applictions properly.
Change-Id: Ied5608b3aa34eca6ab2ca707bc4a633a23cab1fb
Partially-Implements: blueprint multinode
|
YAML
|
apache-2.0
|
asalkeld/kolla-mesos,openstack/kolla-mesos,openstack/kolla-mesos,openstack/kolla-mesos
|
3470ad5fd6a76ebc859ca55a3d44118d50322921
|
_config.yml
|
_config.yml
|
# Site settings
title: the padded cell
email: [email protected]
description: ""
baseurl: ""
url: "http://sanitarium.se"
twitter_username: gaqzi
github_username: gaqzi
facebook_username:
timezone: UTC
# Build settings
markdown: kramdown
highlighter: pygments
permalink: pretty
paginate: 5
exclude:
- less
- node_modules
- Gruntfile.js
- package.json
- README.md
- LICENSE
- css/bootstrap.css
- css/clean-blog.css
- js/bootstrap.js
- js/clean-blog.js
- js/jquery.js
|
# Site settings
title: the padded cell
email: [email protected]
description: ""
baseurl: ""
url: "http://sanitarium.se"
twitter_username: gaqzi
github_username: gaqzi
facebook_username:
timezone: UTC
# Build settings
markdown: kramdown
redcarpet:
extensions:
- footnotes
kramdown:
input: GFM
syntax_highlighter: rouge
hard_wrap: false
highlighter: pygments
permalink: pretty
paginate: 5
exclude:
- less
- node_modules
- Gruntfile.js
- package.json
- README.md
- LICENSE
- css/bootstrap.css
- css/clean-blog.css
- js/bootstrap.js
- js/clean-blog.js
- js/jquery.js
|
Add kramdown configuration for syntax hilighting
|
Add kramdown configuration for syntax hilighting
Now the markdown works just like on Github in regards to syntax
hilighting.
|
YAML
|
apache-2.0
|
gaqzi/sanitarium.se,gaqzi/sanitarium.se,gaqzi/sanitarium.se,gaqzi/sanitarium.se
|
defc27b802cdc45550f000be90b388c1f31654e9
|
_config.yml
|
_config.yml
|
# User Settings | Edit these according to your site
title: Statistical Computing for Bayesian Inference and Forecasting
email: [email protected]
author: Jacob Carey
description: A blog about Bayesian statistics with an emphasis on computational challenges
url: "http://jacobcvt12.gitub.io" # the base hostname & protocol for your site
baseurl: "" # the subpath of your site, e.g. /blog
github_username: jacobcvt12
permalink: /:year/:title.html
paginate: 4
paginate_path: "/page/:num"
comments: true
heading_font: Ubuntu
about_footer: true
profile_picture: /assets/img/jacobcarey.jpeg
google_analytics: true
tracking_id: UA-73774891-1
# Build settings | Do not edit anything below this
markdown: kramdown
gems:
- jekyll-paginate
- jekyll/scholar
sass:
style: compressed
sass_dir: _sass
# Scopes
defaults:
-
scope:
path: ""
type: "pages"
values:
layout: "page"
-
scope:
path: ""
type: "posts"
values:
layout: "post"
|
# Blog info
title: Statistical Computing for Bayesian Inference and Forecasting
email: [email protected]
author: Jacob Carey
description: A blog about Bayesian statistics with an emphasis on computational challenges
# blog and github path
url: "http://jacobcvt12.gitub.io" # the base hostname & protocol for your site
baseurl: "" # the subpath of your site, e.g. /blog
github_username: jacobcvt12
permalink: /:year/:title.html
# paginate settings
paginate: 4
paginate_path: "/page/:num"
# enable disqus comments
comments: true
# prettify header and footer
heading_font: Ubuntu
about_footer: true
profile_picture: /assets/img/jacobcarey.jpeg
# google analytics settings
google_analytics: true
tracking_id: UA-73774891-1
# markdown setting
markdown: kramdown
# which gyms to use
gems:
- jekyll-paginate
- jekyll/scholar
# sass settings
sass:
style: compressed
sass_dir: _sass
# Scopes
defaults:
-
scope:
path: ""
type: "pages"
values:
layout: "page"
-
scope:
path: ""
type: "posts"
values:
layout: "post"
|
Add comments to config yml
|
Add comments to config yml
|
YAML
|
mit
|
jacobcvt12/blog,jacobcvt12/blog,jacobcvt12/blog
|
61e05ce25276eb4640abd4c2ab8880c8457b528e
|
_config.yml
|
_config.yml
|
name: Ramblings of a Dev
description: Blogging about code stuff
meta_description: "Ramblings of a Dev, Blogging about code stuff"
aboutPage: false
markdown: redcarpet
highlighter: pygments
paginate: 20
baseurl: /
domain_name: 'http://mitchel.me'
google_analytics: 'UA-51952407-2'
disqus: true
disqus_shortname: 'mitchelme'
# Details for the RSS feed generator
url: 'http://mitchel.me'
author: 'Mitchel Cabuloy'
authorTwitter: 'mixxorz'
permalink: /:year/:title/
defaults:
-
scope:
path: "" # empty string for all files
type: pages
values:
layout: default
-
scope:
path: "" # empty string for all files
type: posts
values:
layout: post
-
scope:
path: ""
type: drafts
values:
layout: post
|
name: Ramblings of a Code Author
description: This blog used to be about programming
meta_description: "Ramblings of a Code Author, This blog used to be about programming"
aboutPage: false
markdown: redcarpet
highlighter: pygments
paginate: 20
baseurl: /
domain_name: 'http://mitchel.me'
google_analytics: 'UA-51952407-2'
disqus: true
disqus_shortname: 'mitchelme'
# Details for the RSS feed generator
url: 'http://mitchel.me'
author: 'Mitchel Cabuloy'
authorTwitter: 'mixxorz'
permalink: /:year/:title/
defaults:
-
scope:
path: "" # empty string for all files
type: pages
values:
layout: default
-
scope:
path: "" # empty string for all files
type: posts
values:
layout: post
-
scope:
path: ""
type: drafts
values:
layout: post
|
Update blog title and description
|
Update blog title and description
|
YAML
|
mit
|
mixxorz/mixxorz.github.io,mixxorz/mixxorz.github.io,mixxorz/mixxorz.github.io
|
5778a6141a349de72329745fd26db13dff5e9c09
|
chart/templates/storageclass.yaml
|
chart/templates/storageclass.yaml
|
kind: StorageClass
apiVersion: storage.k8s.io/v1
metadata:
name: longhorn
{{- if .Values.persistence.defaultClass }}
annotations:
storageclass.beta.kubernetes.io/is-default-class: "true"
{{- else }}
annotations:
storageclass.beta.kubernetes.io/is-default-class: "false"
{{- end }}
provisioner: rancher.io/longhorn
parameters:
numberOfReplicas: "{{ .Values.persistence.defaultClassReplicaCount }}"
staleReplicaTimeout: "30"
fromBackup: ""
baseImage: ""
|
kind: StorageClass
apiVersion: storage.k8s.io/v1
metadata:
name: longhorn
{{- if .Values.persistence.defaultClass }}
annotations:
storageclass.beta.kubernetes.io/is-default-class: "true"
{{- else }}
annotations:
storageclass.beta.kubernetes.io/is-default-class: "false"
{{- end }}
provisioner: driver.longhorn.io
parameters:
numberOfReplicas: "{{ .Values.persistence.defaultClassReplicaCount }}"
staleReplicaTimeout: "30"
fromBackup: ""
baseImage: ""
|
Update StorageClass with new driver name
|
helm: Update StorageClass with new driver name
Signed-off-by: Shuo Wu <[email protected]>
Signed-off-by: Sheng Yang <[email protected]>
|
YAML
|
apache-2.0
|
rancher/longhorn
|
139fe9b125256b9bf8d4b96729d38b52dc7a788d
|
roles/docker/tasks/main.yml
|
roles/docker/tasks/main.yml
|
---
- name: Download docker install script
get_url:
url: https://get.docker.com
dest: /tmp/docker-install.sh
mode: "a+x"
- name: Run install script
shell: /bin/sh /tmp/docker-install.sh
- name: Remove temporary file
file:
path: /tmp/docker-install.sh
state: absent
- name: Set Docker to auto-start
service:
name: docker
enabled: true
state: started
- name: add pi user to docker group
user:
name: pi
groups: docker
append: yes
become: true
- name: Install pip and deps
apt:
name:
- python3-pip
- libffi6
- libffi-dev
state: latest
- name: Install Docker Compose
command: /usr/bin/pip3 install docker-compose
|
---
- name: Populate service facts
service_facts:
- name: Download docker install script
get_url:
url: https://get.docker.com
dest: /tmp/docker-install.sh
mode: "a+x"
when: ansible_facts.services['docker.service'] is not defined
- name: Run install script
shell: /bin/sh /tmp/docker-install.sh
when: ansible_facts.services['docker.service'] is not defined
- name: Remove temporary file
file:
path: /tmp/docker-install.sh
state: absent
- name: Set Docker to auto-start
service:
name: docker
enabled: true
state: started
- name: add pi user to docker group
user:
name: pi
groups: docker
append: yes
become: true
- name: Install pip and deps
apt:
name:
- python3-pip
- libffi6
- libffi-dev
state: latest
- name: Install Docker Compose
pip:
name: docker-compose
state: present
executable: pip3
|
Install Docker and Docker Compose only If they are not already installed
|
Install Docker and Docker Compose only If they are not already installed
|
YAML
|
apache-2.0
|
tkurki/marinepi-provisioning
|
d761ea056aca15e3b707df6b4a55a5384328abff
|
.github/workflows/build-and-test-ALM-conda.yml
|
.github/workflows/build-and-test-ALM-conda.yml
|
name: Build and test ALM using Conda
on:
workflow_dispatch:
jobs:
build:
runs-on: ${{ matrix.os }}
strategy:
matrix:
os: [ubuntu-latest, macOS-latest]
# os: [ubuntu-latest, windows-latest, macOS-latest]
python-version: [2.7, 3.7, 3.8, 3.9]
name: OS ${{ matrix.os }} Python ${{ matrix.python-version }}
defaults:
run:
shell: bash -l {0}
steps:
- uses: actions/checkout@v3
- name: Setup conda
uses: conda-incubator/setup-miniconda@v2
with:
activate-environment: alm
python-version: ${{ matrix.python-version }}
auto-update-conda: true
channels: conda-forge
- run: |
conda info
conda list
conda config --show-sources
conda config --show
- name: Install conda libraries
run: conda install --yes numpy scipy h5py compilers spglib boost eigen cmake
- run: echo ${CC} ${CXX}
- name: Build ALM library
working-directory: ./python
run: python setup.py build
- name: Place ALM library
working-directory: ./python
run: pip install -e .
- name: Run test Si
working-directory: ./test
run: python Si_fitting.py
- name: Run test SiC
working-directory: ./test
run: python SiC_fitting.py
|
name: Conda build
on: [push]
# workflow_dispatch:
jobs:
build:
runs-on: ${{ matrix.os }}
strategy:
matrix:
os: [ubuntu-latest, macOS-latest]
# os: [ubuntu-latest, windows-latest, macOS-latest]
python-version: [2.7, 3.7, 3.8, 3.9]
name: OS ${{ matrix.os }} Python ${{ matrix.python-version }}
defaults:
run:
shell: bash -l {0}
steps:
- uses: actions/checkout@v3
- name: Setup conda
uses: conda-incubator/setup-miniconda@v2
with:
activate-environment: alm
python-version: ${{ matrix.python-version }}
auto-update-conda: true
channels: conda-forge
- run: |
conda info
conda list
conda config --show-sources
conda config --show
- name: Install conda libraries
run: conda install --yes numpy scipy h5py compilers spglib boost eigen cmake
- run: echo ${CC} ${CXX}
- name: Build ALM library
working-directory: ./python
run: python setup.py build
- name: Place ALM library
working-directory: ./python
run: pip install -e .
- name: Run test Si
working-directory: ./test
run: python Si_fitting.py
- name: Run test SiC
working-directory: ./test
run: python SiC_fitting.py
|
Rename workflow and change to [push] for a trigger
|
Rename workflow and change to [push] for a trigger
|
YAML
|
mit
|
ttadano/ALM,ttadano/ALM,ttadano/ALM,ttadano/ALM
|
f1db31a4a70c3b7882caaababf4c0ce432fec4ca
|
.github/workflows/linting-and-unit-testing.yml
|
.github/workflows/linting-and-unit-testing.yml
|
name: Node.js v14 CI
on: [push]
jobs:
build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- name: Use Node.js v14
uses: actions/setup-node@v1
with:
node-version: '14.x'
- run: npm ci
- run: npm run lint
- run: npm test
|
name: Node.js v14 CI
on: [push]
jobs:
build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- name: Use Node.js v14
uses: actions/setup-node@v2
with:
node-version: '14.x'
- run: npm ci
- run: npm run lint
- run: npm test
|
Update actions/setup-node action to v2
|
Update actions/setup-node action to v2
Signed-off-by: Renovate Bot <[email protected]>
|
YAML
|
mit
|
paazmaya/image-foldarizer
|
ad7af1a5ccd13bf7fb2e3984371f68b19df80031
|
.github/workflows/linting-and-unit-testing.yml
|
.github/workflows/linting-and-unit-testing.yml
|
name: Node.js v16 CI
on: [push]
jobs:
build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- name: Use Node.js v16
uses: actions/setup-node@v2
with:
node-version: '16.x'
- run: npm ci
- run: npm run lint
- run: npm test
|
name: Node.js v16 CI
on: [push]
jobs:
build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- name: Use Node.js v16
uses: actions/setup-node@v3
with:
node-version: '16.x'
- run: npm ci
- run: npm run lint
- run: npm test
|
Update actions/setup-node action to v3
|
Update actions/setup-node action to v3
Signed-off-by: Renovate Bot <[email protected]>
|
YAML
|
mit
|
paazmaya/tozan
|
6a64c8e7dbc1d3a6cbb444dbcdefbeb4632c01fa
|
zuul.d/project.yaml
|
zuul.d/project.yaml
|
---
# Copyright 2017, Rackspace US, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
- project:
name: openstack/openstack-ansible-plugins
check:
jobs:
- openstack-ansible-linters
- openstack-ansible-functional-centos-7
- openstack-ansible-functional-opensuse-423
- openstack-ansible-functional-ubuntu-xenial
- openstack-ansible-python3-ubuntu-xenial-nv
gate:
queue: openstack-ansible
jobs:
- openstack-ansible-linters
- openstack-ansible-functional-centos-7
- openstack-ansible-functional-opensuse-423
- openstack-ansible-functional-ubuntu-xenial
|
---
# Copyright 2017, Rackspace US, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
- project:
name: openstack/openstack-ansible-plugins
check:
jobs:
- openstack-ansible-linters
- openstack-ansible-functional-centos-7
- openstack-ansible-functional-opensuse-423
- openstack-ansible-functional-ubuntu-xenial
- openstack-ansible-python3-ubuntu-xenial-nv
gate:
jobs:
- openstack-ansible-linters
- openstack-ansible-functional-centos-7
- openstack-ansible-functional-opensuse-423
- openstack-ansible-functional-ubuntu-xenial
|
Remove unified queue from gate pipeline
|
Remove unified queue from gate pipeline
Initially the intent for adding this was to better test
any patches for roles together before they merge, but it
has had the unintended side-effect of causing patches to
take much longer to merge (because they all get lined up
in a single queue, rather than independent queues) and
a lot more infra resources are used (because a patch
that fails at the top of the queue will result in all
subsequent patches restarting all their tests).
As discussed in the channel, we'd prefer to revert back
to the previous independent queue method of testing. It
has served us well.
Change-Id: Ib4174ca5639b68871b1227ba41611872d6ba6a90
|
YAML
|
apache-2.0
|
openstack/openstack-ansible-plugins,openstack/openstack-ansible-plugins
|
2e741982813cb1aceffd1bc3473b267e839461dd
|
.circleci/config.yml
|
.circleci/config.yml
|
version: 2
jobs:
build:
docker:
- image: rikorose/gcc-cmake
working_directory: ~/MeadowsECS
steps:
- checkout
- run:
command: |
mkdir ~/build
cd ~/build
cmake -G"Unix Makefiles" -DBUILD_SHARED_LIBS=OFF -DBUILD_TEST=ON ~/MeadowsECS
- run:
command: |
cd ~/build
make -j2 MeadowsECS
- persist_to_workspace:
root: ~/
paths:
- build/*
test:
docker:
- image: rikorose/gcc-cmake
working_directory: ~/MeadowsECS
steps:
- checkout
- attach_workspace:
at: ~/
- run:
command: |
mkdir ~/test-results
cd ~/build
make -j2 MeadowsECSTest
./test/MeadowsECSTest -r junit > ~/test-results/MeadowsECSTest.xml
- store_test_results:
path: ~/test-results
workflows:
version: 2
build_with_dependencies:
jobs:
- build
- test:
requires:
- build
|
version: 2
jobs:
build:
docker:
- image: rikorose/gcc-cmake
working_directory: ~/MeadowsECS
steps:
- checkout
- run:
command: |
mkdir ~/build
cd ~/build
cmake -G"Unix Makefiles" -DBUILD_SHARED_LIBS=OFF -DBUILD_TEST=ON ~/MeadowsECS
- run:
command: |
cd ~/build
make -j2 MeadowsECS
- persist_to_workspace:
root: ~/
paths:
- build/*
test:
docker:
- image: rikorose/gcc-cmake
working_directory: ~/MeadowsECS
steps:
- checkout
- attach_workspace:
at: ~/
- run:
command: |
mkdir -p ~/test-results/unit_tests
cd ~/build
make -j2 MeadowsECSTest
./test/MeadowsECSTest -r junit > ~/test-results/unit_tests/MeadowsECSTest.xml
- store_test_results:
path: ~/test-results
workflows:
version: 2
build_with_dependencies:
jobs:
- build
- test:
requires:
- build
|
Fix test results metadata directory layout for CircleCI
|
Fix test results metadata directory layout for CircleCI
|
YAML
|
mit
|
Konijnendijk/Meadows-ECS
|
a670687998b6acbb8ee0c839f006325278cc9175
|
.circleci/config.yml
|
.circleci/config.yml
|
version: 2
jobs:
build:
docker:
- image: circleci/python:3
steps:
- checkout
- restore_cache:
key: dedupe_trees-{{ checksum "testing-requirements.txt" }}
- run:
name: Prepare Environment
command: |
python3 -m venv venv
. venv/bin/activate
pip install -r testing-requirements.txt
- save_cache:
key: dedupe_trees-{{ checksum "testing-requirements.txt" }}
paths:
- "venv"
- run:
name: Run Tests
command: |
. venv/bin/activate
mkdir -p test-reports
pytest --junitxml=test-reports/junit.xml --cov-config .coveragerc --cov=dedupe_trees --cov=bin.dedupe_trees
codecov
- store_test_results:
path: test-reports
|
version: 2
jobs:
build:
docker:
- image: circleci/python:3
steps:
- checkout
- restore_cache:
key: dedupe_trees-{{ checksum "testing-requirements.txt" }}
- run:
name: Prepare Environment
command: |
python3 -m venv venv
. venv/bin/activate
pip install -r testing-requirements.txt
- save_cache:
key: dedupe_trees-{{ checksum "testing-requirements.txt" }}
paths:
- "venv"
- run:
name: Run Tests
command: |
. venv/bin/activate
mkdir -p test-reports
pytest --junitxml=test-reports/junit.xml --cov-config .coveragerc --cov=dedupe_trees
codecov
- store_test_results:
path: test-reports
|
Remove old module name from pytest
|
Remove old module name from pytest
|
YAML
|
mit
|
davidmreed/dedupe.py
|
e8eb2b00ac524d02d1f81a9d8a9c633e89e6e5c6
|
.circleci/config.yml
|
.circleci/config.yml
|
version: 2
jobs:
py3test:
working_directory: ~/megnet
docker:
- image: materialsvirtuallab/circle-ci-pmg-py3:0.0.2
steps:
- checkout
- run:
command: |
export PATH=$HOME/miniconda3/bin:$PATH
conda config --set always_yes yes --set changeps1 no
conda update -q conda
conda info -a
conda create -q -n test-environment python=3.7 numpy scipy matplotlib sympy cython
source activate test-environment
conda update --quiet numpy scipy matplotlib sympy cython
pip install --quiet --ignore-installed -r requirements.txt -r requirements-ci.txt
- run:
command: |
export PATH=$HOME/miniconda3/bin:$PATH
source activate test-environment
pip install --quiet -e .
pytest --cov=megnet --cov-report html:coverage_reports --quiet megnet
pycodestyle megnet
coveralls
no_output_timeout: 3600
- store_artifacts:
path: coverage_reports/
destination: tr1
- store_test_results:
path: coverage_reports/
workflows:
version: 2
build_and_test:
jobs:
- py3test
|
version: 2
jobs:
py3test:
working_directory: ~/megnet
docker:
- image: materialsvirtuallab/circle-ci-pmg-py3:0.0.2
steps:
- checkout
- run:
command: |
export PATH=$HOME/miniconda3/bin:$PATH
conda config --set always_yes yes --set changeps1 no
conda update -q conda
conda info -a
conda create -q -n test-environment python=3.7 numpy scipy matplotlib sympy cython tensorflow keras
source activate test-environment
conda update --quiet numpy scipy matplotlib sympy cython tensorflow keras
pip install --quiet --ignore-installed -r requirements.txt -r requirements-ci.txt
- run:
command: |
export PATH=$HOME/miniconda3/bin:$PATH
source activate test-environment
pip install --quiet -e .
pytest --cov=megnet --cov-report html:coverage_reports --quiet megnet
pycodestyle megnet
coveralls
no_output_timeout: 3600
- store_artifacts:
path: coverage_reports/
destination: tr1
- store_test_results:
path: coverage_reports/
workflows:
version: 2
build_and_test:
jobs:
- py3test
|
Use conda to install tensorflow.
|
Use conda to install tensorflow.
|
YAML
|
bsd-3-clause
|
materialsvirtuallab/megnet,materialsvirtuallab/megnet,materialsvirtuallab/megnet,materialsvirtuallab/megnet,materialsvirtuallab/megnet
|
822566fdb45529b9957937308ebe2e08e71692b8
|
.circleci/config.yml
|
.circleci/config.yml
|
# Python CircleCI 2.0 configuration file
#
# Check https://circleci.com/docs/2.0/language-python/ for more details
#
version: 2
jobs:
build:
docker:
- image: circleci/python:3.6.5@sha256:0e53e4bb368a1ce85dd381ca5b0c61a3bbbd9e6de24bab68e6faf3cbed3d5415
working_directory: ~/repo
steps:
- checkout
- restore_cache:
keys:
- v1-dependencies-{{ checksum "requirements.txt" }}-{{ checksum "dev-requirements.txt" }}
# fallback to using the latest cache if no exact match is found
- v1-dependencies-
- run:
name: install dependencies
command: |
python3 -m venv venv
. venv/bin/activate
python setup.py dev_persistent_pip_sync
- save_cache:
paths:
- ./venv
key: v1-dependencies-{{ checksum "requirements.txt" }}-{{ checksum "dev-requirements.txt" }}
- run:
name: run tests
command: |
. venv/bin/activate
flake8 bamboo_crawler tests
export AWS_ACCESS_KEY_ID=''
export AWS_SECRET_ACCESS_KEY=''
python setup.py test
- store_artifacts:
path: test-reports
destination: test-reports
|
# Python CircleCI 2.0 configuration file
#
# Check https://circleci.com/docs/2.0/language-python/ for more details
#
version: 2
jobs:
build:
docker:
- image: circleci/python:3.6.5@sha256:55a914663dd53cdf25adfb6cab1145530977dae44c71321dd23de08f5da94ef1
working_directory: ~/repo
steps:
- checkout
- restore_cache:
keys:
- v1-dependencies-{{ checksum "requirements.txt" }}-{{ checksum "dev-requirements.txt" }}
# fallback to using the latest cache if no exact match is found
- v1-dependencies-
- run:
name: install dependencies
command: |
python3 -m venv venv
. venv/bin/activate
python setup.py dev_persistent_pip_sync
- save_cache:
paths:
- ./venv
key: v1-dependencies-{{ checksum "requirements.txt" }}-{{ checksum "dev-requirements.txt" }}
- run:
name: run tests
command: |
. venv/bin/activate
flake8 bamboo_crawler tests
export AWS_ACCESS_KEY_ID=''
export AWS_SECRET_ACCESS_KEY=''
python setup.py test
- store_artifacts:
path: test-reports
destination: test-reports
|
Update circleci/python:3.6.5 Docker digest to 55a914
|
Update circleci/python:3.6.5 Docker digest to 55a914
|
YAML
|
bsd-3-clause
|
kitsuyui/bamboo-crawler,kitsuyui/bamboo-crawler,kitsuyui/bamboo-crawler
|
9531edc573660392bf200ede13ca1461a739f19c
|
.circleci/config.yml
|
.circleci/config.yml
|
version: 2.1
jobs:
ubuntu1604-python2:
docker:
- image: stbtester/circleci:ubuntu1604-python2
environment:
PYTHON: /usr/bin/python2.7
steps:
- checkout
- test
ubuntu1804-python2:
docker:
- image: stbtester/circleci:ubuntu1804
environment:
PYTHON: /usr/bin/python2.7
steps:
- checkout
- test
ubuntu1804-python3:
docker:
- image: stbtester/circleci:ubuntu1804
environment:
PYTHON: /usr/bin/python3.6
steps:
- checkout
- test
commands:
test:
steps:
- run:
name: make check
environment:
LANG: en_GB.UTF-8
SHELL: /bin/bash
command: |
tesseract --version
pylint --version
make enable_stbt_camera=no enable_virtual_stb=yes check
workflows:
test_all:
jobs:
- ubuntu1604-python2
|
version: 2.1
jobs:
ubuntu1604-python2:
docker:
- image: stbtester/circleci:ubuntu1604-python2
environment:
PYTHON: /usr/bin/python2.7
steps:
- checkout
- test
ubuntu1804-python2:
docker:
- image: stbtester/circleci:ubuntu1804
environment:
PYTHON: /usr/bin/python2.7
steps:
- checkout
- test
ubuntu1804-python3:
docker:
- image: stbtester/circleci:ubuntu1804
environment:
PYTHON: /usr/bin/python3.6
steps:
- checkout
- test
commands:
test:
steps:
- run:
name: make check
environment:
LANG: en_GB.UTF-8
SHELL: /bin/bash
TERM: xterm
command: |
tesseract --version
pylint --version
make enable_stbt_camera=no enable_virtual_stb=yes check
workflows:
test_all:
jobs:
- ubuntu1604-python2
|
Set $TERM to fix tput & curses errors
|
circleci: Set $TERM to fix tput & curses errors
This should fix
`test_stbt_control_as_stbt_record_control_recorder__default_keymap`
which needs curses.
It should also silence warnings from `tput` which is used by
`run-tests.sh` to output pretty colours.
|
YAML
|
lgpl-2.1
|
stb-tester/stb-tester,stb-tester/stb-tester,stb-tester/stb-tester,stb-tester/stb-tester
|
59e1cc0b0549ac3bafed93012355b7f835531a70
|
appspec.yml
|
appspec.yml
|
version: 0.0
os: linux
files:
- source: /index.html
destination: /var/www/html/
hooks:
BeforeInstall:
- location: scripts/install_dependencies
timeout: 300
runas: root
- location: scripts/start_server
timeout: 300
runas: root
- location: scripts/elb/register_with_elb.sh
timeout: 300
runas: root
ApplicationStop:
- location: scripts/elb/deregister_from_elb.sh
timeout: 300
runas: root
- location: scripts/stop_server
timeout: 300
runas: root
|
version: 0.0
os: linux
files:
- source: /index.html
destination: /var/www/html/
hooks:
BeforeInstall:
- location: scripts/elb/deregister_from_elb.sh
timeout: 300
runas: root
- location: scripts/install_dependencies
timeout: 300
runas: root
- location: scripts/start_server
timeout: 300
runas: root
- location: scripts/elb/register_with_elb.sh
timeout: 300
runas: root
ApplicationStop:
- location: scripts/stop_server
timeout: 300
runas: root
|
Test with deregister under BeforeInstall.
|
Test with deregister under BeforeInstall.
|
YAML
|
apache-2.0
|
Jmcfar/codedeploy-sample-app,Jmcfar/codedeploy-sample-app
|
3d15169078d26fb0f4f291003372b57c1dddd19f
|
config/locales/multiple_upload.pl.yml
|
config/locales/multiple_upload.pl.yml
|
en:
admin:
actions:
dropzone:
dictDefaultMessage: "Upuść pliki tutaj, żeby przesłać"
dictFallbackMessage: "Twoja przeglądarka nie wspiera przesyłania plików metodą drag'n'drop."
dictFallbackText: "Proszę skorzystać z zastępczego poniżej formularza, aby przesłać pliki tak jak w dawnych czasach."
dictFileTooBig: "Plik jest zbyt duży ({{filesize}}MiB). Max rozmiar pliku: {{maxFilesize}}MiB."
dictInvalidFileType: "Nie można przesłać plików tego typu."
dictResponseError: "Serwer zgłosił {{statusCode}} kod."
dictCancelUpload: "Anuluj przesyłanie"
dictCancelUploadConfirmation: "Czy na pewno chcesz anulować przesyłanie?"
dictRemoveFile: "Usuń plik"
dictMaxFilesExceeded: "Nie możesz przesłać więcej plików."
title: "Prześlij kilka plików"
menu: "Prześlij kilka plików dla %{model_label} '%{object_label}'"
breadcrumb: "Prześlij kilka plików"
link: "Prześlij kilka plików"
bulk_link: "Zaznaczono przesyłanie kilku plików %{model_label_plural}"
done: "Kilka plików przesłano"
|
en:
admin:
actions:
dropzone:
dictDefaultMessage: "Upuść pliki tutaj, żeby przesłać"
dictFallbackMessage: "Twoja przeglądarka nie wspiera przesyłania plików metodą drag'n'drop."
dictFallbackText: "Proszę skorzystać z zastępczego poniżej formularza, aby przesłać pliki tak jak w dawnych czasach."
dictFileTooBig: "Plik jest zbyt duży ({{filesize}}MiB). Max rozmiar pliku: {{maxFilesize}}MiB."
dictInvalidFileType: "Nie można przesłać plików tego typu."
dictResponseError: "Serwer zgłosił {{statusCode}} kod."
dictCancelUpload: "Anuluj przesyłanie"
dictCancelUploadConfirmation: "Czy na pewno chcesz anulować przesyłanie?"
dictRemoveFile: "Usuń plik"
dictMaxFilesExceeded: "Nie możesz przesłać więcej plików."
title: "Prześlij kilka plików"
menu: "Prześlij kilka plików dla %{model_label} '%{object_label}'"
breadcrumb: "Prześlij kilka plików"
link: "Prześlij kilka plików"
bulk_link: "Zaznaczono przesyłanie kilku plików %{model_label_plural}"
done: "Kilka plików przesłano"
|
Remove blank line at the end of a file
|
Remove blank line at the end of a file
|
YAML
|
mit
|
luizpicolo/rails_admin_dropzone,luizpicolo/rails_admin_dropzone,luizpicolo/rails_admin_dropzone
|
4850e792732c3b04f7fbeee13ada78a0d4ae230f
|
.gitlab-ci.yml
|
.gitlab-ci.yml
|
# .gitlab-ci.yml
build:
#image: ubuntu:18.04
#image: xed-testing-container
image: amr-registry-pre.caas.intel.com/xed/xed-testing-container
stage: build
script:
- python3 ci-internal.py
|
# .gitlab-ci.yml
variables:
PACKAGE_NAME: xed-common
build:
#image: ubuntu:18.04
#image: xed-testing-container
image: amr-registry-pre.caas.intel.com/xed/xed-testing-container
stage: build
script:
- python3 ci-internal.py
build-conan:
image: amr-registry.caas.intel.com/syssim/teamcity-agent:2020.1.5-21ww05
stage: build
script:
- virtualenv --python="$(which python3)" ~/.syssim-virtualenv
- source ~/.syssim-virtualenv/bin/activate
- pip install conan pylint astroid yapf
- conan config install https://gitlab.devtools.intel.com/syssim/conan-config.git
- |-
if [[ $CI_COMMIT_REF_NAME == main && $CI_COMMIT_TAG == v* ]]; then
PACKAGE_REF=$PACKAGE_NAME/${CI_COMMIT_TAG#v*}@xed/stable
else
PACKAGE_REF=$PACKAGE_NAME/$CI_COMMIT_SHA@xed/ci
fi
conan create . $PACKAGE_REF --build=missing --profile=gcc9-native
if [[ $CI_COMMIT_REF_NAME == main ]]; then
conan user -r syssim-public "$CONAN_USERNAME" -p "$CONAN_PASSWORD"
conan upload $PACKAGE_REF -r syssim-public --force
LATEST_REF=$PACKAGE_NAME/latest@xed/ci
conan alias $LATEST_REF $PACKAGE_REF
conan upload $LATEST_REF -r syssim-public --force
fi
|
Add GitLab CI trigger for Conan package
|
Add GitLab CI trigger for Conan package
- Build Conan package as part of CI
- Upload to Artifactory for commits to "main":
- use "PACKAGE/VERSION@xed/stable" reference when tagged with "vVERSION"
- use "PACKAGE/SHA@xed/ci" reference for unversioned commits
- Update alias reference "PACKAGE/latest@xed/ci"
|
YAML
|
apache-2.0
|
intelxed/xed,intelxed/xed,intelxed/xed,intelxed/xed
|
9c85078c067652f7820603d6930d109260d82d9c
|
.gitlab-ci.yml
|
.gitlab-ci.yml
|
cache:
paths:
- vendor/
before_script:
- curl -sS https://getcomposer.org/installer | php
- php composer.phar install --prefer-dist --classmap-authoritative --no-interaction --no-progress
test:7.0:
image: php:7.0-alpine
script:
- php composer.phar test
test:7.1:
image: php:7.1-alpine
script:
- php composer.phar test
test:7.2:
image: php:7.2-rc-alpine
allow_failure: true
script:
- php composer.phar test
|
cache:
paths:
- vendor/
before_script:
- curl -sS https://getcomposer.org/installer | php
- php composer.phar install --prefer-dist --classmap-authoritative --no-interaction --no-progress
test:7.1:
image: php:7.1-alpine
script:
- php composer.phar test
test:7.2:
image: php:7.2-rc-alpine
allow_failure: true
script:
- php composer.phar test
|
Drop support for PHP 7.0
|
Drop support for PHP 7.0
|
YAML
|
mit
|
ThePixelDeveloper/Sitemap-v2,ThePixelDeveloper/Sitemap
|
8ef818dd96b795db58854235ec3a9d566c46436c
|
.gitlab-ci.yml
|
.gitlab-ci.yml
|
---
image: gebish/ci:v4
variables:
GRADLE_USER_HOME: .gradle-home
check:
stage: build
cache:
key: "$CI_JOB_NAME"
paths:
- .gradle-home
artifacts:
when: always
expire_in: 4 weeks
paths:
- build/reports
script:
- Xvfb :99 -screen 1 1280x1024x16 -nolisten tcp -fbdir /var/run > /dev/null 2>&1 &
- export DISPLAY=:99
- export FIREFOX_VERSION=53.0.3
- ./gradlew --no-daemon check
|
---
image: gebish/ci:v5
variables:
GRADLE_USER_HOME: .gradle-home
check:
stage: build
cache:
key: "$CI_JOB_NAME"
paths:
- .gradle-home
artifacts:
when: always
expire_in: 4 weeks
paths:
- build/reports
script:
- Xvfb :99 -screen 1 1280x1024x16 -nolisten tcp -fbdir /var/run > /dev/null 2>&1 &
- export DISPLAY=:99
- export FIREFOX_VERSION=53.0.3
- ./gradlew --no-daemon check
|
Update docker image used by CI so that a newer version of Chrome is used to run the tests
|
Update docker image used by CI so that a newer version of Chrome is used to run the tests
|
YAML
|
apache-2.0
|
geb/geb-example-gradle
|
0b50c78393d4bea0253aeacda8787547daca3d6b
|
.gitlab-ci.yml
|
.gitlab-ci.yml
|
# .gitlab-ci.yml -- configures gitlab.com CI system
#
# The MIT Licence (MIT)
#
# Copyright (c) 2016 Samuel Sirois (sds) <[email protected]>
#
# Permission is hereby granted, free of charge, to any person obtaining a
# copy of this software and associated documentation files (the "Software"),
# to deal in the Software without restriction, including without limitation
# the rights to use, copy, modify, merge, publish, distribute, sublicense,
# and/or sell copies of the Software, and to permit persons to whom the
# Software is furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included
# in all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
# EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
# MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
# NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
# DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
# OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR
# THE USE OR OTHER DEALINGS IN THE SOFTWARE.
before_script:
- apt-get update -qq && apt-get install -y -qq shunit2 xsltproc
stages:
- test
- deploy
check:
stage: test
script:
- make check
build-example-artifacts:
stage: deploy
script:
- make build-example-artifacts
artifacts:
paths:
- artifacts/
|
# .gitlab-ci.yml -- configures gitlab.com CI system
#
# The MIT Licence (MIT)
#
# Copyright (c) 2016 Samuel Sirois (sds) <[email protected]>
#
# Permission is hereby granted, free of charge, to any person obtaining a
# copy of this software and associated documentation files (the "Software"),
# to deal in the Software without restriction, including without limitation
# the rights to use, copy, modify, merge, publish, distribute, sublicense,
# and/or sell copies of the Software, and to permit persons to whom the
# Software is furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included
# in all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
# EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
# MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
# NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
# DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
# OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR
# THE USE OR OTHER DEALINGS IN THE SOFTWARE.
before_script:
- apt-get update -qq && apt-get install -y -qq shunit2 xsltproc poppler-utils
stages:
- test
- deploy
check:
stage: test
script:
- make check
build-example-artifacts:
stage: deploy
script:
- make build-example-artifacts
artifacts:
paths:
- artifacts/
|
Add pdf tools (we need pdftohtml)
|
Add pdf tools (we need pdftohtml)
|
YAML
|
mit
|
ssirois/hephaestus
|
0293bee8aa6c7a6d2150d03df713bd1ac7eb4e7b
|
.swiftlint.yml
|
.swiftlint.yml
|
# Configuration for SwiftLint (https://github.com/realm/SwiftLint)
excluded:
- Carthage
opt_in_rules:
- empty_count
- missing_docs
disabled_rules:
- comma
- cyclomatic_complexity
- function_body_length
- line_length
- opening_brace
- valid_docs
|
# Configuration for SwiftLint (https://github.com/realm/SwiftLint)
excluded:
- Carthage
opt_in_rules:
- empty_count
- missing_docs
- vertical_whitespace
disabled_rules:
- comma
- cyclomatic_complexity
- function_body_length
- line_length
- opening_brace
- valid_docs
|
Enable the vertical_whitespace SwiftLint rule
|
Enable the vertical_whitespace SwiftLint rule
|
YAML
|
mit
|
mattrubin/onetimepassword,mattrubin/onetimepassword
|
0141cf1b5cfb9134b8d90081e4beb5c8c519da37
|
.swiftlint.yml
|
.swiftlint.yml
|
disabled_rules:
- force_cast
- line_length
- variable_name_min_length
- trailing_whitespace
- nesting
- opening_brace
excluded:
- Sources/Container.Arguments.swift
- Sources/SynchronizedResolver.Arguments.swift
- Sources/Resolvable.swift
- Tests
- Sample-iOS.playground
- Carthage
|
disabled_rules:
- force_cast
- line_length
- variable_name_min_length
- trailing_whitespace
- nesting
- opening_brace
- valid_docs
excluded:
- Sources/Container.Arguments.swift
- Sources/SynchronizedResolver.Arguments.swift
- Sources/Resolvable.swift
- Tests
- Sample-iOS.playground
- Carthage
|
Exclude 'valid_docs' rule because SwiftLint v0.6.0 still has problems with the rule.
|
Exclude 'valid_docs' rule because SwiftLint v0.6.0 still has problems with the rule.
|
YAML
|
mit
|
yoichitgy/Swinject,yoichitgy/Swinject,Swinject/Swinject,yoichitgy/Swinject,Swinject/Swinject,Swinject/Swinject,yoichitgy/Swinject,Swinject/Swinject
|
28854e7d0b4ca95e9c31fac71801a2b5b67c3d1e
|
boxfile.yml
|
boxfile.yml
|
run.config:
engine: golang
engine.config:
runtime: go-1.8
package: 'github.com/hasit/healthyrepo'
fetch: 'go get'
build: 'go build'
web.api:
start: healthyrepo
|
run.config:
engine: golang
engine.config:
runtime: go-1.8
package: 'github.com/hasit/healthyrepo'
fetch: 'go get -u'
build: 'go build'
web.api:
start: healthyrepo
|
Update fetch hook to 'go get -u'
|
Update fetch hook to 'go get -u'
|
YAML
|
mit
|
hasit/healthyrepo,hasit/healthyrepo,hasit/healthyrepo
|
144fa82196f65c4e9a9224e2a0bcb2c1630cd6bf
|
data/building-hours/1-3-stav.yaml
|
data/building-hours/1-3-stav.yaml
|
name: Stav Hall
image: stav
category: Food
schedule:
- title: Breakfast
hours:
- {days: [Mo, Tu, We, Th, Fr, Sa], from: '7:00am', to: '9:45am'}
- {days: [Su], from: '8:30am', to: '10:15am'}
- title: Lunch
hours:
# - {days: [Mo, Tu, We, Th, Fr], from: '10:30am', to: '2:00pm'}
# - {days: [Sa, Su], from: '11:00am', to: '1:30pm'}
- {days: [Mo, Tu, We, Th, Fr], from: '10:45am', to: '1:30pm'}
- {days: [Sa, Su], from: '11:00am', to: '1:30pm'}
- title: Dinner
hours:
# - {days: [Mo, Tu, We, Th, Fr, Sa, Su], from: '4:30pm', to: '7:30pm'}
- {days: [Mo, Tu, We, Th, Fr, Sa, Su], from: '4:30pm', to: '7:00pm'}
breakSchedule:
fall: []
thanksgiving: []
winter: []
interim: []
spring: []
easter: []
summer: []
|
name: Stav Hall
image: stav
category: Food
schedule:
- title: Breakfast
# hours:
# - {days: [Mo, Tu, We, Th, Fr, Sa], from: '7:00am', to: '9:45am'}
# - {days: [Su], from: '8:30am', to: '10:15am'}
hours: []
- title: Lunch
notes: NO MEAL PLANS - department charges, cash, credit, Ole or Flex dollars only
hours:
# - {days: [Mo, Tu, We, Th, Fr], from: '10:30am', to: '2:00pm'}
# - {days: [Sa, Su], from: '11:00am', to: '1:30pm'}
- {days: [Mo, Tu, We, Th, Fr], from: '12:00pm', to: '1:00pm'}
- title: Dinner
notes: NO MEAL PLANS - department charges, cash, credit, Ole or Flex dollars only
hours:
# - {days: [Mo, Tu, We, Th, Fr, Sa, Su], from: '4:30pm', to: '7:30pm'}
- {days: [Mo, Tu, We, Th, Fr], from: '5:00pm', to: '6:00pm'}
breakSchedule:
fall: []
thanksgiving: []
winter: []
interim: []
spring: []
easter: []
summer: []
|
Update stav hours for interim break
|
Update stav hours for interim break
|
YAML
|
agpl-3.0
|
StoDevX/AAO-React-Native,carls-app/carls,carls-app/carls,carls-app/carls,StoDevX/AAO-React-Native,carls-app/carls,StoDevX/AAO-React-Native,StoDevX/AAO-React-Native,StoDevX/AAO-React-Native,StoDevX/AAO-React-Native,StoDevX/AAO-React-Native,carls-app/carls,carls-app/carls,StoDevX/AAO-React-Native,StoDevX/AAO-React-Native,carls-app/carls
|
f7bed64314392d6823f5b7bdb75501a6c06b88cf
|
handlers/main.yml
|
handlers/main.yml
|
---
#
# Copyright (c) 2014 Davide Guerri <[email protected]>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
- name: Start keystone
service: name=keystone state=started
- name: Reload keystone
service: name=keystone state=reloaded
- name: Restart keystone
service: name=keystone state=restarted
- name: Sync keystone db
shell: keystone-manage db_sync
sudo: yes
sudo_user: keystone
notify:
- Restart keystone
|
---
#
# Copyright (c) 2014 Davide Guerri <[email protected]>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
- name: Start keystone
service: name=keystone state=started
- name: Reload keystone
service: name=keystone state=reloaded
- name: Restart keystone
service: name=keystone state=restarted
- name: Sync keystone db
command: keystone-manage db_sync
sudo: yes
sudo_user: keystone
notify:
- Restart keystone
|
Use command instead of shell
|
Use command instead of shell
|
YAML
|
apache-2.0
|
dguerri/openstack-keystone,openstack-ansible-galaxy/openstack-keystone,openstack-ansible-galaxy/openstack-keystone
|
0617e681a56e2dae2e998785a2f81d26ce524a68
|
codecov.yml
|
codecov.yml
|
ignore:
- "bindata*.go"
- "rice-box.go"
|
ignore:
# Generated code
- "bindata*.go"
- "rice-box.go"
# Documentation
- "cmd/lorem-ipsum/*"
|
Exclude doc generation, and add comments
|
Exclude doc generation, and add comments
|
YAML
|
mit
|
jmcfarlane/Notable,jmcfarlane/Notable,jmcfarlane/Notable,jmcfarlane/Notable
|
6f66e0f2a72ce72bd828fe4a36fcba770e6303fb
|
sdk/identity/ci.yml
|
sdk/identity/ci.yml
|
# NOTE: Please refer to https://aka.ms/azsdk/engsys/ci-yaml before editing this file.
trigger:
branches:
include:
- master
- hotfix/*
- release/*
- restapi*
paths:
include:
- sdk/identity/
- sdk/core/
pr:
branches:
include:
- master
- feature/*
- hotfix/*
- release/*
- restapi*
paths:
include:
- sdk/identity/
- sdk/core/
extends:
template: ../../eng/pipelines/templates/stages/archetype-sdk-client.yml
parameters:
ServiceDirectory: identity
AdditionalTestMatrix:
Linux_Python35_msal:
OSVmImage: 'ubuntu-18.04'
PythonVersion: '3.5'
CoverageArg: ''
RunForPR: false
InjectedPackages: 'git+https://github.com/AzureAD/microsoft-authentication-library-for-python@dev'
Artifacts:
- name: azure_identity
safeName: azureidentity
|
# NOTE: Please refer to https://aka.ms/azsdk/engsys/ci-yaml before editing this file.
trigger:
branches:
include:
- master
- hotfix/*
- release/*
- restapi*
paths:
include:
- sdk/identity/
- sdk/core/
pr:
branches:
include:
- master
- feature/*
- hotfix/*
- release/*
- restapi*
paths:
include:
- sdk/identity/
- sdk/core/
extends:
template: ../../eng/pipelines/templates/stages/archetype-sdk-client.yml
parameters:
ServiceDirectory: identity
AdditionalTestMatrix:
Linux_Python35_msal:
Pool: Azure Pipelines
OSVmImage: 'ubuntu-18.04'
PythonVersion: '3.5'
CoverageArg: ''
RunForPR: false
InjectedPackages: 'git+https://github.com/AzureAD/microsoft-authentication-library-for-python@dev'
Artifacts:
- name: azure_identity
safeName: azureidentity
|
Add Pool name to the additional matrix entry
|
Add Pool name to the additional matrix entry
Needed to add the Pool name to the additional matrix entry.
|
YAML
|
mit
|
Azure/azure-sdk-for-python,Azure/azure-sdk-for-python,Azure/azure-sdk-for-python,Azure/azure-sdk-for-python
|
8660e290a9e36103dcc30b81cedc37815c17eeaa
|
GitVersion.yml
|
GitVersion.yml
|
assembly-versioning-format: '{major}.{minor}.{WeightedPreReleaseNumber}'
assembly-informational-format: '{major}.{minor}{PreReleaseTagWithDash}--{CommitDate}--{ShortSha}'
mode: ContinuousDeployment
branches:
master:
regex: ^master
tag: beta
pre-release-weight: 30000 # 0 after stable release, 15000 before alpha release, 30000 before beta release, 50000 before stable release
|
assembly-versioning-format: '{major}.{minor}.{WeightedPreReleaseNumber}'
assembly-informational-format: '{major}.{minor}{PreReleaseTagWithDash}--{CommitDate}--{ShortSha}'
mode: ContinuousDeployment
continuous-delivery-fallback-tag: ''
branches:
master:
regex: ^master
tag: beta
pre-release-weight: 30000 # 0 after stable release, 15000 before alpha release, 30000 before beta release, 50000 before stable release
|
Remove ci prefix for releases
|
Remove ci prefix for releases
|
YAML
|
mit
|
sauliusg/jabref,sauliusg/jabref,sauliusg/jabref,Siedlerchr/jabref,Siedlerchr/jabref,Siedlerchr/jabref,JabRef/jabref,JabRef/jabref,Siedlerchr/jabref,JabRef/jabref,sauliusg/jabref,JabRef/jabref
|
9ad580fed8c7c7fbb1288e39d929ced21457e4f1
|
.github/workflows/main.yml
|
.github/workflows/main.yml
|
name: CI
on:
# Triggers the workflow on push or pull request events but only for the master branch
push:
branches: [ b2.5, b3.0, master]
pull_request:
branches: [ b2.5, b3.0, master, SPARKC-651 ]
workflow_dispatch:
jobs:
build:
runs-on: ubuntu-latest
strategy:
matrix:
scala: [2.11.12]
db-version: [6.8.13, 5.1.24, 3.11.10, 4.0-rc2]
steps:
- uses: actions/checkout@v2
- name: ccm pip installation
uses: BSFishy/pip-action@v1
with:
packages: git+git://github.com/riptano/ccm.git@435f3210e16d0b648fbf33d6390d5ab4c9e630d4
- name: Setup Scala
uses: olafurpg/setup-scala@v10
with:
java-version: "[email protected]"
- name: sbt tests
env:
TEST_PARALLEL_TASKS: 1
CCM_CASSANDRA_VERSION: ${{ matrix.db-version }}
PUBLISH_VERSION: test
run: sbt/sbt ++${{ matrix.scala }} test it:test
|
name: CI
on:
# Triggers the workflow on push or pull request events but only for the master branch
push:
branches: [ b2.5, b3.0, master]
pull_request:
branches: [ b2.5, b3.0, master, SPARKC-651 ]
workflow_dispatch:
jobs:
build:
runs-on: ubuntu-latest
strategy:
matrix:
scala: [2.11.12]
db-version: [6.8.13, 5.1.24, 3.11.10, 4.0-rc2]
steps:
- uses: actions/checkout@v2
- name: ccm pip installation
uses: BSFishy/pip-action@v1
with:
packages: git+https://github.com/riptano/ccm.git@435f3210e16d0b648fbf33d6390d5ab4c9e630d4
- name: Setup Scala
uses: olafurpg/setup-scala@v10
with:
java-version: "[email protected]"
- name: sbt tests
env:
TEST_PARALLEL_TASKS: 1
CCM_CASSANDRA_VERSION: ${{ matrix.db-version }}
PUBLISH_VERSION: test
run: sbt/sbt ++${{ matrix.scala }} test it:test
|
Switch ccm dependency URL from git to https protocol
|
Switch ccm dependency URL from git to https protocol
|
YAML
|
apache-2.0
|
datastax/spark-cassandra-connector,datastax/spark-cassandra-connector
|
ead12256cc8b4bbbf1a49f17d28b797c8f141c8f
|
.github/workflows/ruby.yml
|
.github/workflows/ruby.yml
|
name: Ruby
on: [push]
jobs:
build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- name: Set up Ruby 2.6
uses: actions/setup-ruby@v1
with:
ruby-version: 2.6.x
- name: Build and test with Rake
run: |
gem install bundler
bundle install --jobs 4 --retry 3
bundle exec rake test
|
name: Ruby
on:
push:
schedule:
- cron: "59 11 * * *"
jobs:
build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- name: Set up Ruby 2.6
uses: actions/setup-ruby@v1
with:
ruby-version: 2.6.x
- name: Build and test with Rake
run: |
gem install bundler
bundle install --jobs 4 --retry 3
bundle exec rake test
|
Add cron to run build each night
|
Add cron to run build each night
|
YAML
|
mit
|
psyomn/komplement,psyomn/komplement
|
76f66c245ea54627a4d189930df4caf6e3e02dfa
|
.github/workflows/rust.yml
|
.github/workflows/rust.yml
|
on:
pull_request:
push:
branches:
- master
schedule:
- cron: '00 01 * * *'
name: CI
env:
CARGO_INCREMENTAL: 0
jobs:
build-and-test:
name: build
runs-on: ubuntu-latest
env:
RUSTFLAGS: -D warnings
steps:
- uses: actions/checkout@v2
- uses: actions-rs/toolchain@v1
with:
toolchain: stable
- uses: actions-rs/cargo@v1
with:
command: build
args: --all-targets --all-features
- uses: actions-rs/cargo@v1
with:
command: test
- uses: actions-rs/cargo@v1
with:
command: doc
rustfmt:
name: rustfmt
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: actions-rs/toolchain@v1
with:
profile: minimal
toolchain: stable
components: rustfmt
- uses: actions-rs/cargo@v1
with:
command: fmt
args: -- --check
clippy:
name: clippy
runs-on: ubuntu-latest
steps:
- name: Install system dependencies
run: sudo apt-get install libssl-dev
- uses: actions/checkout@v2
- uses: actions-rs/toolchain@v1
with:
profile: minimal
toolchain: stable
components: clippy
- uses: actions-rs/cargo@v1
with:
command: clippy
|
on:
pull_request:
push:
branches:
- master
schedule:
- cron: '00 01 * * *'
name: CI
env:
CARGO_INCREMENTAL: 0
jobs:
build-and-test:
name: build
runs-on: ubuntu-latest
env:
RUSTFLAGS: -D warnings
steps:
- name: Install system dependencies
run: sudo apt-get install libssl-dev
- uses: actions/checkout@v2
- uses: actions-rs/toolchain@v1
with:
toolchain: stable
- uses: actions-rs/cargo@v1
with:
command: build
args: --all-targets --all-features
- uses: actions-rs/cargo@v1
with:
command: test
- uses: actions-rs/cargo@v1
with:
command: doc
rustfmt:
name: rustfmt
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: actions-rs/toolchain@v1
with:
profile: minimal
toolchain: stable
components: rustfmt
- uses: actions-rs/cargo@v1
with:
command: fmt
args: -- --check
clippy:
name: clippy
runs-on: ubuntu-latest
steps:
- name: Install system dependencies
run: sudo apt-get install libssl-dev
- uses: actions/checkout@v2
- uses: actions-rs/toolchain@v1
with:
profile: minimal
toolchain: stable
components: clippy
- uses: actions-rs/cargo@v1
with:
command: clippy
|
Install OpenSSL as well for builds and tests.
|
Install OpenSSL as well for builds and tests.
|
YAML
|
mit
|
jeschkies/letterboxd-list-sync
|
43075a89c5bc89a9174811e77c927e60fd34673a
|
.github/workflows/test.yml
|
.github/workflows/test.yml
|
name: Test
on: [push]
jobs:
build:
runs-on: ubuntu-latest
strategy:
matrix:
node-version: [12.x, 14.x]
steps:
- uses: actions/checkout@v2
- name: Use Node.js ${{ matrix.node-version }}
uses: actions/setup-node@v1
with:
node-version: ${{ matrix.node-version }}
- run: npm install --global [email protected]
- run: pnpm install
- run: pnpm test
|
name: Test
on: [push]
jobs:
build:
runs-on: ubuntu-latest
strategy:
matrix:
node-version: [12.x, 14.x]
steps:
- uses: actions/checkout@v2
- name: Use Node.js ${{ matrix.node-version }}
uses: actions/setup-node@v2
with:
node-version: ${{ matrix.node-version }}
- run: npm install --global [email protected]
- run: pnpm install
- run: pnpm test
|
Update actions/setup-node action to v2
|
Update actions/setup-node action to v2
|
YAML
|
mit
|
elliottsj/front-matter-loader
|
f43be5455ac896abcbf9475386607f099c190734
|
.github/workflows/test.yml
|
.github/workflows/test.yml
|
---
name: Continuous Integration
"on":
- push
- pull_request
jobs:
tests:
runs-on: ubuntu-latest
strategy:
fail-fast: false
matrix:
ruby:
- 2.6.8
# TODO - 2.7
gemfile:
- gemfiles/rails4.2.gemfile
- gemfiles/rails5.0.gemfile
- gemfiles/rails5.1.gemfile
- gemfiles/rails5.2.gemfile
- gemfiles/rails6.0.gemfile
env:
BUNDLE_GEMFILE: ${{ matrix.gemfile }}
steps:
- uses: zendesk/checkout@v2
- uses: zendesk/setup-ruby@v1
with:
ruby-version: ${{ matrix.ruby }}
bundler-cache: true
- name: test
run: bundle exec rake test
services:
mysql:
image: mysql:5.7
ports:
- 3306:3306
env:
MYSQL_ROOT_PASSWORD: ''
MYSQL_ALLOW_EMPTY_PASSWORD: 'yes'
options: >-
--health-cmd="mysqladmin ping"
--health-interval=2s
--health-timeout=1s
--health-retries=30
validate:
runs-on: ubuntu-latest
steps:
- uses: zendesk/checkout@v2
- uses: zendesk/setup-ruby@v1
with:
ruby-version: 2.6.8
bundler-cache: true
- name: rubocop
run: bundle exec rubocop
|
---
name: Continuous Integration
"on":
- push
- pull_request
jobs:
tests:
runs-on: ubuntu-latest
strategy:
fail-fast: false
matrix:
ruby:
- 2.6
- 2.7
gemfile:
- gemfiles/rails4.2.gemfile
- gemfiles/rails5.0.gemfile
- gemfiles/rails5.1.gemfile
- gemfiles/rails5.2.gemfile
- gemfiles/rails6.0.gemfile
exclude:
- ruby: 2.7
gemfile: gemfiles/rails4.2.gemfile
env:
BUNDLE_GEMFILE: ${{ matrix.gemfile }}
steps:
- uses: zendesk/checkout@v2
- uses: zendesk/setup-ruby@v1
with:
ruby-version: ${{ matrix.ruby }}
bundler-cache: true
- name: test
run: bundle exec rake test
services:
mysql:
image: mysql:5.7
ports:
- 3306:3306
env:
MYSQL_ROOT_PASSWORD: ''
MYSQL_ALLOW_EMPTY_PASSWORD: 'yes'
options: >-
--health-cmd="mysqladmin ping"
--health-interval=2s
--health-timeout=1s
--health-retries=30
validate:
runs-on: ubuntu-latest
steps:
- uses: zendesk/checkout@v2
- uses: zendesk/setup-ruby@v1
with:
ruby-version: 2.6
bundler-cache: true
- name: rubocop
run: bundle exec rubocop
|
Add ruby 2.7, switch to 2.6 to pick up newer versions.
|
Add ruby 2.7, switch to 2.6 to pick up newer versions.
|
YAML
|
apache-2.0
|
zendesk/kasket
|
50c090a5602f3962b87fa0289b60e526832554b2
|
.github/workflows/test.yml
|
.github/workflows/test.yml
|
name: Perl module test
on: [push, pull_request]
jobs:
perl-job:
runs-on: ${{ matrix.os }}
strategy:
matrix:
os: ['ubuntu-latest']
perl: [ 'latest', '5.32', '5.30' ]
name: Perl ${{ matrix.perl }} on ${{ matrix.os }}
container:
image: perldocker/perl-tester:${{ matrix.perl }}
steps:
- uses: actions/checkout@v2
- name: Regular tests
run: |
dzil authordeps --missing | cpanm --notest
dzil listdeps --author --missing | cpanm --notest
dzil test --author --release
|
name: Perl module test
on: [push, pull_request]
jobs:
perl-job:
runs-on: ${{ matrix.os }}
strategy:
fail-fast: false
matrix:
os: ['ubuntu-latest']
perl: [ 'latest', '5.32', '5.30' ]
name: Perl ${{ matrix.perl }} on ${{ matrix.os }}
container:
image: perldocker/perl-tester:${{ matrix.perl }}
steps:
- uses: actions/checkout@v2
- name: Regular tests
run: |
dzil authordeps --missing | cpanm --notest
dzil listdeps --author --missing | cpanm --notest
dzil test --author --release
|
Set fail-fast to false in github action
|
Set fail-fast to false in github action
|
YAML
|
apache-2.0
|
elastic/elasticsearch-perl,elastic/elasticsearch-perl
|
0a6a0459d3924f46ba73a42220f15b6db1cd3763
|
post.yml
|
post.yml
|
---
- name: set body content from file
set_fact:
payload: "{{ lookup('file', file) }}"
- name: dump payload to terminal
debug: var=payload
when: verbose
- name: POST request
uri:
url: "{{ url }}"
method: POST
user: "{{ user }}"
password: "{{ password }}"
body: "{{ payload }}"
force_basic_auth: yes
return_content: yes
HEADER_Content-Type: "application/xml"
status_code: 201
register: response
- name: save response
copy:
content: "{{ response.location }}"
dest: "{{ playbook_dir }}/{{ savefile | default('response.txt') }}"
|
---
- name: set body content from file
set_fact:
payload: "{{ lookup('file', file) }}"
- name: dump payload to terminal
debug: var=payload
when: verbose
- name: POST request
uri:
url: "{{ url }}"
method: POST
user: "{{ user }}"
password: "{{ password }}"
body: "{{ payload }}"
force_basic_auth: yes
return_content: yes
HEADER_Content-Type: "application/xml"
status_code: 201
register: response
- name: save response
lineinfile:
line: "{{ response.location }}"
dest: "{{ playbook_dir }}/{{ savefile | default('response.txt') }}"
create: yes
|
Append POST results to file for easy batch delete
|
Append POST results to file for easy batch delete
|
YAML
|
mit
|
lyrasis/csible,lyrasis/csible
|
e665e794003173cfba6e850b03b5ba3149101150
|
.forestry/front_matter/templates/person.yml
|
.forestry/front_matter/templates/person.yml
|
---
label: person
hide_body: false
is_partial: false
fields:
- type: text
name: title
label: Name
- name: sub_heading
label: Sub Heading
type: text
hidden: false
default: ''
- type: file
label: Thumbnail
name: thumbnail
description: 'Square image, Max 500 pixels '
- type: tag_list
label: Roles
name: roles
description: What roles the person has within PANOPTES
- name: email
label: Email
type: text
hidden: false
default: ''
- name: personal_url
label: Webpage
type: text
hidden: false
default: ''
description: If the member has a url they would like linked.
- name: layout
label: Layout
type: text
hidden: true
default: person
pages:
- _people/olivier-guyon.md
- _people/josh-walawender.md
- _people/nem-jovanovic.md
- _people/wilfred-gee.md
- _people/mike-butterfield.md
- _people/kathy-guyon.md
- _people/jen-tong.md
|
---
label: person
hide_body: false
is_partial: false
fields:
- type: text
name: title
label: Name
- name: sub_heading
label: Sub Heading
type: text
hidden: false
default: ''
- type: file
label: Thumbnail
name: thumbnail
description: 'Square image, Max 500 pixels '
- type: tag_list
label: Roles
name: roles
description: What roles the person has within PANOPTES
- name: email
label: Email
type: text
hidden: false
default: ''
- name: layout
label: Layout
type: text
hidden: true
default: person
- type: field_group_list
name: links
label: Links
description: Any webpages for the person.
fields:
- type: text
name: label
label: Label
description: Name for web page.
- type: text
name: href
label: href
description: Link to web page.
pages:
- _people/olivier-guyon.md
- _people/josh-walawender.md
- _people/nem-jovanovic.md
- _people/wilfred-gee.md
- _people/mike-butterfield.md
- _people/kathy-guyon.md
- _people/jen-tong.md
|
Update from Forestry.io - Updated Forestry configuration
|
Update from Forestry.io - Updated Forestry configuration
|
YAML
|
mit
|
panoptes/panoptes.github.io,panoptes/panoptes.github.io,panoptes/panoptes.github.io
|
690ca765fc03b9252bc15e589fbce8f3d2b4c283
|
metadata/taco.scoop.yml
|
metadata/taco.scoop.yml
|
Categories:
- System
- Development
License: Apache-2.0
SourceCode: https://github.com/TacoTheDank/Scoop
IssueTracker: https://github.com/TacoTheDank/Scoop/issues
Changelog: https://github.com/TacoTheDank/Scoop/releases
AutoName: Scoop
RepoType: git
Repo: https://github.com/TacoTheDank/Scoop
Builds:
- versionName: 2.1.3
versionCode: 28
commit: v2.1.3
subdir: app
gradle:
- yes
- versionName: 2.2.0
versionCode: 29
commit: c5194605861c9a5f15b6e6fc53313180fb727d1d
subdir: app
gradle:
- yes
scandelete:
- app/libs
AutoUpdateMode: Version v%v
UpdateCheckMode: Tags
CurrentVersion: 2.2.0
CurrentVersionCode: 29
|
Categories:
- System
- Development
License: Apache-2.0
SourceCode: https://github.com/TacoTheDank/Scoop
IssueTracker: https://github.com/TacoTheDank/Scoop/issues
Changelog: https://github.com/TacoTheDank/Scoop/releases
AutoName: Scoop
RepoType: git
Repo: https://github.com/TacoTheDank/Scoop
Builds:
- versionName: 2.1.3
versionCode: 28
commit: v2.1.3
subdir: app
gradle:
- yes
- versionName: 2.2.0
versionCode: 29
commit: c5194605861c9a5f15b6e6fc53313180fb727d1d
subdir: app
gradle:
- yes
scandelete:
- app/libs
- versionName: 2.3.0
versionCode: 30
commit: 8c09aea822265f9b8d5b9529d82b8860e3c5cb4e
subdir: app
gradle:
- yes
scandelete:
- app/libs
AutoUpdateMode: Version v%v
UpdateCheckMode: Tags
CurrentVersion: 2.3.0
CurrentVersionCode: 30
|
Update Scoop to 2.3.0 (30)
|
Update Scoop to 2.3.0 (30)
|
YAML
|
agpl-3.0
|
f-droid/fdroiddata,f-droid/fdroiddata
|
4bd7ad6e17701a84e30e4ef3c494b032e15f673f
|
model/actual_group.yaml
|
model/actual_group.yaml
|
# Copyright 2019 Alain Dargelas
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# Universal Hardware Data Model (UHDM) "actual group" formal description
- group_def: actual_group
- obj_ref: interface
- obj_ref: interface_array
- obj_ref: modport
- class_ref: nets
- class_ref: variables
- obj_ref: named_event
- obj_ref: named_event_array
- obj_ref: part_select
|
# Copyright 2019 Alain Dargelas
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# Universal Hardware Data Model (UHDM) "actual group" formal description
# Parameters are not in the actual group in the Standard, we add them to the group for binding convenience
- group_def: actual_group
- obj_ref: interface
- obj_ref: interface_array
- obj_ref: modport
- class_ref: nets
- class_ref: variables
- obj_ref: named_event
- obj_ref: named_event_array
- obj_ref: part_select
- obj_ref: parameter
|
Add parameter to actual group
|
Add parameter to actual group
|
YAML
|
apache-2.0
|
chipsalliance/UHDM,chipsalliance/UHDM,chipsalliance/UHDM
|
ca6d22343f61d4154cba1257a7366ce4ff462780
|
.github/workflows/cdaf.yml
|
.github/workflows/cdaf.yml
|
name: CDAF Targetless CD using hosted agent
# This workflow is triggered on pushes to the repository.
on: [push]
jobs:
build:
name: Execute all steps on Single Agent
runs-on: windows-latest
env:
DOCKERHUB_TOKEN: ${{ secrets.DOCKERHUB_TOKEN }}
COMPOSE_KEEP: yes
steps:
- name: Checkout
uses: actions/checkout@v2
- name: Fetch all history for all tags and branches
run: |
git config remote.origin.url https://x-access-token:${{ secrets.GITHUB_TOKEN }}@github.com/${{ github.repository }}
git fetch --prune --unshallow
- name: Execute Compose for all branches
shell: powershell # pwsh for PowerShell Core
run: |
cd ${env:GITHUB_WORKSPACE}
curl.exe -s https://codeload.github.com/cdaf/windows/zip/refs/heads/master -o windows.zip
Expand-Archive .\windows.zip .
.\windows-master\automation\entry.ps1 ${env:GITHUB_RUN_NUMBER} ${env:GITHUB_REF}
- name: Login to Docker Hub
uses: docker/login-action@v1
if: ${{ env.DOCKERHUB_TOKEN }} && ( github.ref == 'refs/heads/master' )
with:
username: cdaf
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: PUSH
if: ${{ env.DOCKERHUB_TOKEN }} && ( github.ref == 'refs/heads/master' )
run: |
cd ${env:GITHUB_WORKSPACE}
.\TasksLocal\delivery.bat PUSH
|
name: CDAF Targetless CD using hosted agent
# This workflow is triggered on pushes to the repository.
on: [push]
jobs:
build:
name: Execute all steps on Single Agent
runs-on: windows-latest
env:
DOCKERHUB_TOKEN: ${{ secrets.DOCKERHUB_TOKEN }}
COMPOSE_KEEP: yes
steps:
- name: Checkout
uses: actions/checkout@v2
- name: Fetch all history for all tags and branches
run: |
git config remote.origin.url https://x-access-token:${{ secrets.GITHUB_TOKEN }}@github.com/${{ github.repository }}
git fetch --prune --unshallow
- name: Execute Compose for all branches
shell: powershell # pwsh for PowerShell Core
run: |
cd ${env:GITHUB_WORKSPACE}
curl.exe -s https://codeload.github.com/cdaf/windows/zip/refs/heads/master -o windows.zip
Expand-Archive .\windows.zip .
.\windows-master\automation\entry.ps1 ${env:GITHUB_RUN_NUMBER} ${env:GITHUB_REF}
- name: Login to Docker Hub
uses: docker/login-action@v1
if: ${{ env.DOCKERHUB_TOKEN }} && ( github.ref == 'refs/heads/master' )
with:
username: cdaf
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: PUSH
if: ${{ env.DOCKERHUB_TOKEN }} && ( github.ref == 'refs/heads/master' )
shell: powershell # pwsh for PowerShell Core
run: |
cd ${env:GITHUB_WORKSPACE}
.\TasksLocal\delivery.bat PUSH
|
Use PowerShell Desktop, not Core, which is the default when the shell is not defined
|
Use PowerShell Desktop, not Core, which is the default when the shell is not defined
|
YAML
|
apache-2.0
|
cdaf/cbe,Semprini/cbe,Semprini/cbe,cdaf/cbe
|
048d2b804f88bf2bb55bee91ad27c25e3fb68648
|
.github/workflows/main.yml
|
.github/workflows/main.yml
|
# Action was shamelessly copied from here: https://stackoverflow.com/a/64311970
name: github pages
on:
push:
branches:
- main
jobs:
deploy:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- name: Setup Python
uses: actions/setup-python@v2
with:
python-version: '3.9.4'
- name: Upgrade pip
run: |
python3 -m pip install --upgrade pip
- name: Get pip cache dir
id: pip-cache
run: echo "::set-output name=dir::$(pip cache dir)"
- name: Cache dependencies
uses: actions/cache@v2
with:
path: ${{ steps.pip-cache.outputs.dir }}
key: ${{ runner.os }}-pip-${{ hashFiles('**/requirements.txt') }}
restore-keys: |
${{ runner.os }}-pip-
- name: Install dependencies
run: python3 -m pip install -r ./requirements.txt
- name: Generate HTML
run: |
make html
echo academics.cs.luc.edu > build/html/CNAME
- name: Deploy
uses: peaceiris/actions-gh-pages@v3
with:
github_token: ${{ secrets.GITHUB_TOKEN }}
publish_dir: ./build/html
|
# Action was shamelessly copied from here: https://stackoverflow.com/a/64311970
name: github pages
on:
push:
branches:
- main
- master
jobs:
deploy:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- name: Setup Python
uses: actions/setup-python@v2
with:
python-version: '3.9.4'
- name: Upgrade pip
run: |
python3 -m pip install --upgrade pip
- name: Get pip cache dir
id: pip-cache
run: echo "::set-output name=dir::$(pip cache dir)"
- name: Cache dependencies
uses: actions/cache@v2
with:
path: ${{ steps.pip-cache.outputs.dir }}
key: ${{ runner.os }}-pip-${{ hashFiles('**/requirements.txt') }}
restore-keys: |
${{ runner.os }}-pip-
- name: Install dependencies
run: python3 -m pip install -r ./requirements.txt
- name: Generate HTML
run: |
make html
echo academics.cs.luc.edu > build/html/CNAME
- name: Deploy
uses: peaceiris/actions-gh-pages@v3
with:
github_token: ${{ secrets.GITHUB_TOKEN }}
publish_dir: ./build/html
|
Add master branch as a branch to build updates on
|
Add master branch as a branch to build updates on
|
YAML
|
apache-2.0
|
LoyolaChicagoCS/coursedescriptions,LoyolaChicagoCS/coursedescriptions,LoyolaChicagoCS/coursedescriptions
|
0d1fe14e9e98c91cee22e9e62dcd9dc958227abb
|
.github/workflows/main.yml
|
.github/workflows/main.yml
|
name: CI
on:
# Triggers the workflow on push or pull request events but only for the master branch
push:
branches: [ master ]
pull_request:
branches: [ master ]
# Allows you to run this workflow manually from the Actions tab
workflow_dispatch:
jobs:
test:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: php-actions/composer@v5
- name: PHPUnit Tests
uses: php-actions/phpunit@v2
with:
php_extensions: xdebug
bootstrap: vendor/autoload.php
configuration: phpunit.xml
args: --coverage-text
env:
XDEBUG_MODE: coverage
|
name: CI
on:
# Triggers the workflow on push or pull request events but only for the master branch
push:
branches: [ master ]
pull_request:
branches: [ master ]
# Allows you to run this workflow manually from the Actions tab
workflow_dispatch:
jobs:
test:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: php-actions/composer@v5
- name: PHPUnit Tests
uses: php-actions/phpunit@v2
with:
php_version: 7.4
php_extensions: xdebug
bootstrap: vendor/autoload.php
configuration: phpunit.xml
args: --coverage-text
env:
XDEBUG_MODE: coverage
|
Set CI php version to 7.4
|
Set CI php version to 7.4
|
YAML
|
mit
|
t1gor/Robots.txt-Parser-Class
|
7f426e644653ed447b30f526c8e4f050eb66b86a
|
.github/workflows/main.yml
|
.github/workflows/main.yml
|
name: CI
on: [push, pull_request]
jobs:
build:
runs-on: ubuntu-latest
strategy:
matrix:
ruby: [ '2.5', '2.6', '2.7', '3.0' ]
minimal: [ false, true ]
name: Ruby ${{ matrix.ruby }} tests, minimal=${{ matrix.minimal }}
steps:
- uses: actions/checkout@v2
- name: Setup Ruby
uses: ruby/setup-ruby@v1
with:
ruby-version: ${{ matrix.ruby }}
- uses: actions/cache@v2
with:
path: vendor/bundle
key: ${{ runner.os }}-${{ matrix.ruby }}-gems-${{ hashFiles('Gemfile', 'frozen_record.gemspec') }}
restore-keys: |
${{ runner.os }}-${{ matrix.ruby }}-gems-
- name: Bundle install
run: |
gem install bundler
bundle install --jobs 4 --retry 3 --path=vendor/bundle
- name: Run tests
env:
MINIMAL: ${{ matrix.minimal }}
run: bundle exec rake
|
name: CI
on: [push, pull_request]
jobs:
build:
runs-on: ubuntu-latest
strategy:
matrix:
ruby: [ '2.5', '2.6', '2.7', '3.0' , '3.1']
minimal: [ false, true ]
name: Ruby ${{ matrix.ruby }} tests, minimal=${{ matrix.minimal }}
steps:
- uses: actions/checkout@v2
- name: Setup Ruby
uses: ruby/setup-ruby@v1
with:
ruby-version: ${{ matrix.ruby }}
- uses: actions/cache@v2
with:
path: vendor/bundle
key: ${{ runner.os }}-${{ matrix.ruby }}-gems-${{ hashFiles('Gemfile', 'frozen_record.gemspec') }}
restore-keys: |
${{ runner.os }}-${{ matrix.ruby }}-gems-
- name: Bundle install
run: |
gem install bundler
bundle install --jobs 4 --retry 3 --path=vendor/bundle
- name: Run tests
env:
MINIMAL: ${{ matrix.minimal }}
run: bundle exec rake
|
Add Ruby 3.1 to the CI matrix
|
Add Ruby 3.1 to the CI matrix
|
YAML
|
mit
|
byroot/frozen_record
|
e8bcb4439ed9dd597f1cbf3e5200f43e5456e246
|
.github/workflows/main.yml
|
.github/workflows/main.yml
|
# This is a basic workflow to help you get started with Actions
name: CI
on:
push:
branches: [ master ]
pull_request:
branches: [ master ]
workflow_dispatch:
jobs:
test:
runs-on: ubuntu-18.04
steps:
- uses: actions/checkout@v3
- name: Setup Node.js environment
uses: actions/[email protected]
with:
node-version: 16.5.0
- name: Reuse npm cache folder
uses: actions/cache@v3
env:
cache-name: cache-node-modules
with:
path: |
~/.npm
./node_modules
# invalidate cache when any package-lock.json changes
key: ${{ runner.os }}-npm-test-x1-${{ hashFiles('**/package.json') }}
restore-keys: |
${{ runner.os }}-npm-test-x1-
- name: install npm dependencies
# TODO remove legace peer deps once everything is up to date
run: npm install --legacy-peer-deps
- name: build
run: npm run build
- name: test:node
run: npm run test:node
- name: test:browser
uses: GabrielBB/xvfb-action@v1
with:
run: npm run test:browser
- name: lint
run: npm run lint
|
# This is a basic workflow to help you get started with Actions
name: CI
on:
push:
branches: [ master ]
pull_request:
branches: [ master ]
workflow_dispatch:
jobs:
test:
runs-on: ubuntu-18.04
steps:
- uses: actions/checkout@v3
- name: Setup Node.js environment
uses: actions/[email protected]
with:
node-version: 16.5.0
- name: Reuse npm cache folder
uses: actions/cache@v3
env:
cache-name: cache-node-modules
with:
path: |
~/.npm
./node_modules
# invalidate cache when any package-lock.json changes
key: ${{ runner.os }}-npm-test-x1-${{ hashFiles('**/package.json') }}
restore-keys: |
${{ runner.os }}-npm-test-x1-
- name: install npm dependencies
# TODO remove legace peer deps once everything is up to date
run: npm install --legacy-peer-deps
- name: build
run: npm run build
- name: test:node
run: npm run test:node
- name: test:browser
uses: GabrielBB/xvfb-action@v1
with:
run: npm run test:browser
- name: lint
run: npm run lint
|
Update actions/setup-node action to v3.4.0
|
Update actions/setup-node action to v3.4.0
|
YAML
|
apache-2.0
|
pubkey/async-test-util
|
306c24eab8e3a55bd630a63d44945bffab25ee4c
|
.github/workflows/rust.yml
|
.github/workflows/rust.yml
|
name: Rust
on:
push:
branches:
- master
pull_request: {}
jobs:
check-format:
runs-on: ubuntu-18.04
steps:
- uses: actions/checkout@v1
- uses: actions-rs/cargo@v1
with:
command: fmt
args: -- --check
clippy:
runs-on: ubuntu-18.04
steps:
- uses: actions/checkout@v1
- uses: actions-rs/cargo@v1
with:
command: clippy
coverage:
runs-on: ubuntu-18.04
container:
image: rustmath/mkl-rust:1.43.0
options: --security-opt seccomp=unconfined
steps:
- uses: actions/checkout@v2
- name: Generate code coverage
run: |
cargo tarpaulin --verbose --features=intel-mkl --out Xml --manifest-path=ndarray-linalg/Cargo.toml
- name: Upload to codecov.io
uses: codecov/codecov-action@v1
|
name: Rust
on:
push:
branches:
- master
pull_request: {}
jobs:
check-format:
runs-on: ubuntu-18.04
steps:
- uses: actions/checkout@v1
- uses: actions-rs/cargo@v1
with:
command: fmt
args: -- --check
clippy:
runs-on: ubuntu-18.04
steps:
- uses: actions/checkout@v1
- uses: actions-rs/cargo@v1
with:
command: clippy
coverage:
runs-on: ubuntu-18.04
container:
image: rustmath/mkl-rust:1.43.0
options: --security-opt seccomp=unconfined
steps:
- uses: actions/checkout@v2
- name: Generate code coverage
run: |
cargo tarpaulin --verbose --features=intel-mkl --out Xml --manifest-path=ndarray-linalg/Cargo.toml
- name: Upload to codecov.io
uses: codecov/codecov-action@v1
doc:
runs-on: ubuntu-18.04
steps:
- uses: actions/checkout@v2
- name: Generate code coverage
run: |
RUSTDOCFLAGS="--html-in-header katex-header.html" cargo doc --no-deps
mv target/doc public
- name: Deploy GitHub Pages
uses: peaceiris/actions-gh-pages@v3
with:
github_token: ${{ secrets.GITHUB_TOKEN }}
publish_dir: ./public
|
Add Action setting to build and deploy cargo doc
|
Add Action setting to build and deploy cargo doc
|
YAML
|
mit
|
termoshtt/ndarray-linalg
|
418e8f57d900f7752b2c95f5f1a340614cd15a31
|
.github/workflows/rust.yml
|
.github/workflows/rust.yml
|
name: Rust
on:
push:
branches: [ master ]
pull_request:
branches: [ master ]
env:
CARGO_TERM_COLOR: always
jobs:
test:
runs-on: ubuntu-latest
strategy:
fast-fail: false
matrix:
rust: [stable, beta, nightly]
steps:
- uses: actions/checkout@v2
- name: Install Rust
run: rustup update ${{ matrix.rust }} && rustup default ${{ matrix.rust }}
- name: Run Tests
run: cargo test --verbose
- run: cargo build --all --all-features --all-targets
- name: Catch missing feature flags
if: startsWith(matrix.rust, 'nightly')
run: cargo check -Z features=dev_dep
- name: Install cargo-hack
uses: taiki-e/install-action@cargo-hack
- run: rustup target add thumbv7m-none-eabi
- name: Ensure we don't depend on libstd
run: cargo hack build --target thumbv7m-none-eabi --no-dev-deps --no-default-features
miri:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- name: Install Rust
run: rustup update nightly --component miri && rustup default nightly
- run: cargo miri test
env:
MIRIFLAGS: -Zmiri-strict-provenance -Zmiri-symbolic-alignment-check -Zmiri-disable-isolation
RUSTFLAGS: ${{ env.RUSTFLAGS }} -Z randomize-layout
|
name: Rust
on:
push:
branches: [ master ]
pull_request:
branches: [ master ]
env:
CARGO_TERM_COLOR: always
jobs:
build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- name: Build
run: cargo build --verbose
- name: Run tests
run: cargo test --verbose
|
Revert "Add more tests to the CI"
|
Revert "Add more tests to the CI"
|
YAML
|
mit
|
mvdnes/spin-rs
|
6e43951c677e2378f12c101b83d3f658de3d3b0f
|
.github/workflows/test.yml
|
.github/workflows/test.yml
|
name: Test
on:
push:
branches: [ main ]
pull_request:
branches: [ main ]
jobs:
test:
strategy:
matrix:
php-versions: ['5.6', '7.4', '8.0', '8.1']
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v2
- name: Setup PHP with Xdebug
uses: shivammathur/setup-php@v2
with:
php-version: '7.4'
coverage: xdebug
- name: Validate composer.json and composer.lock
run: composer validate --strict
- name: Cache Composer packages
id: composer-cache
uses: actions/cache@v2
with:
path: vendor
key: ${{ runner.os }}-php-${{ hashFiles('**/composer.lock') }}
restore-keys: |
${{ runner.os }}-php-
- name: Install dependencies
run: composer install --dev --prefer-dist --no-progress --no-interaction
- name: Run test suite
run: composer run-script test
- name: Upload coverage results to Coveralls
env:
COVERALLS_REPO_TOKEN: ${{ secrets.GITHUB_TOKEN }}
run: php vendor/bin/php-coveralls --coverage_clover=build/logs/clover.xml -v
|
name: Test
on:
push:
branches: [ main ]
pull_request:
branches: [ main ]
jobs:
test:
strategy:
matrix:
php-versions: ['5.6', '7.4', '8.0', '8.1']
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v2
- name: Setup PHP with Xdebug
uses: shivammathur/setup-php@v2
with:
php-version: ${{ matrix.php-versions }}
coverage: xdebug
- name: Validate composer.json and composer.lock
run: composer validate --strict
- name: Cache Composer packages
id: composer-cache
uses: actions/cache@v2
with:
path: vendor
key: ${{ runner.os }}-php-${{ hashFiles('**/composer.lock') }}
restore-keys: |
${{ runner.os }}-php-
- name: Install dependencies
run: composer install --dev --prefer-dist --no-progress --no-interaction
- name: Run test suite
run: composer run-script test
- name: Upload coverage results to Coveralls
env:
COVERALLS_REPO_TOKEN: ${{ secrets.GITHUB_TOKEN }}
run: php vendor/bin/php-coveralls --coverage_clover=build/logs/clover.xml -v
|
Use matrix of PHP versions.
|
Use matrix of PHP versions.
|
YAML
|
mit
|
Dreamseer/php-shorten
|
4329d92b540b0973f606b707a28e12001dc0e603
|
install_files/ansible-base/roles/common/tasks/from_fpf_repo_install_grsec.yml
|
install_files/ansible-base/roles/common/tasks/from_fpf_repo_install_grsec.yml
|
---
# The ubuntu motd include displaying some system load averages.
# this causes a grsec: From 10.0.2.2: denied RWX mmap of <anonymous mapping>
# by /usr/bin/landscape-sysinfo[landscape-sysin:3393] uid/euid:0/0
# gid/egid:0/0, parent
# /usr/share/landscape/landscape-sysinfo.wrapper[50-landscape-sy:3386]
# uid/euid:0/0 gid/egid:0/0
# Disabling calling that script during login.
- name: remove motd pam module from ssh logins
lineinfile: dest=/etc/pam.d/sshd regexp=pam.motd state=absent backup=yes
- name: install grsec predepends paxctl package
apt: pkg=paxctl state=present
- name: make the required grub paxctl changes
command: paxctl -Cpm {{ item }}
with_items: grub_pax
- name: install grsec package from fpf repo
apt: pkg={{ grsec_package }} state=latest
async: 200
poll: 10
|
---
# The ubuntu motd include displaying some system load averages.
# this causes a grsec: From 10.0.2.2: denied RWX mmap of <anonymous mapping>
# by /usr/bin/landscape-sysinfo[landscape-sysin:3393] uid/euid:0/0
# gid/egid:0/0, parent
# /usr/share/landscape/landscape-sysinfo.wrapper[50-landscape-sy:3386]
# uid/euid:0/0 gid/egid:0/0
# Disabling calling that script during login.
- name: remove motd pam module from ssh logins
lineinfile: dest=/etc/pam.d/sshd regexp=pam.motd state=absent backup=yes
- name: install grsec predepends paxctl package
apt: pkg=paxctl state=present
- name: make the required grub paxctl changes
command: paxctl -Cpm {{ item }}
with_items: grub_pax
- name: install grsec package from fpf repo
apt: pkg={{ grsec_package }} state=latest
async: 300
poll: 10
|
Increase grsec install timeout to 5min
|
Increase grsec install timeout to 5min
|
YAML
|
agpl-3.0
|
mark-in/securedrop-prov-upstream,mark-in/securedrop-prov-upstream,mark-in/securedrop-prov-upstream,mark-in/securedrop-prov-upstream
|
8b29120ef6ae55084b8a24d554c002e442613bce
|
.reek.yml
|
.reek.yml
|
---
detectors:
IrresponsibleModule:
enabled: false
UncommunicativeModuleName:
enabled: false
TooManyMethods:
max_methods: 10
exclude:
- "Test"
TooManyStatements:
max_statements: 15
exclude:
- "Test"
UtilityFunction:
exclude:
- "Test"
TooManyConstants:
enabled: false
exclude_paths:
- bin
- node_modules
- public
- vendor
- db
|
---
detectors:
IrresponsibleModule:
enabled: false
UncommunicativeModuleName:
enabled: false
UncommunicativeVariableName:
accept_key: ['e']
TooManyMethods:
max_methods: 10
exclude:
- "Test"
TooManyStatements:
max_statements: 15
exclude:
- "Test"
UtilityFunction:
exclude:
- "Test"
TooManyConstants:
enabled: false
exclude_paths:
- bin
- node_modules
- public
- vendor
- db
|
Add standard rescue variable to the Reek rule exceptions
|
Add standard rescue variable to the Reek rule exceptions
|
YAML
|
mit
|
dreikanter/feeder,dreikanter/feeder,dreikanter/feeder
|
2ba876c75caf28d3902c029d6d91c6fd1f061f34
|
jenkins/ci.suse.de/templates/cloud-mkcloud-job-upgrade-nondisruptive-ha-without-nodes-upgrade-template.yaml
|
jenkins/ci.suse.de/templates/cloud-mkcloud-job-upgrade-nondisruptive-ha-without-nodes-upgrade-template.yaml
|
- job-template:
name: 'cloud-mkcloud{version}-job-upgrade-nondisruptive-ha-without-nodes-upgrade-{arch}'
node: cloud-trigger
triggers:
- timed: '32 22 * * *'
logrotate:
numToKeep: -1
daysToKeep: 7
builders:
- trigger-builds:
- project: openstack-mkcloud
condition: SUCCESS
block: true
current-parameters: true
predefined-parameters: |
TESTHEAD=1
cloudsource=develcloud{previous_version}
upgrade_cloudsource=develcloud{version}
nodenumber=4
hacloud=1
storage_method=swift
mkcloudtarget=plain_with_upgrade_test testsetup
label={label}
job_name=cloud-mkcloud{version}-job-upgrade-nondisruptive-ha-without-nodes-upgrade-{arch}
|
- job-template:
name: 'cloud-mkcloud{version}-job-upgrade-nondisruptive-ha-without-nodes-upgrade-{arch}'
node: cloud-trigger
triggers:
- timed: '32 22 * * *'
logrotate:
numToKeep: -1
daysToKeep: 7
builders:
- trigger-builds:
- project: openstack-mkcloud
condition: SUCCESS
block: true
current-parameters: true
predefined-parameters: |
TESTHEAD=1
cloudsource=develcloud{previous_version}
upgrade_cloudsource=develcloud{version}
nodenumber=4
hacloud=1
storage_method=swift
mkcloudtarget=plain_with_upgrade_test
label={label}
job_name=cloud-mkcloud{version}-job-upgrade-nondisruptive-ha-without-nodes-upgrade-{arch}
|
Remove testsetup from job that does not upgrade nodes
|
Remove testsetup from job that does not upgrade nodes
With this job, only the admin server is installed, and nodes
do not even have Cloud7 repositories set up.
|
YAML
|
apache-2.0
|
gosipyan/automation,aspiers/automation,aspiers/automation,gosipyan/automation,vmoravec/automation,vmoravec/automation,gosipyan/automation,SUSE-Cloud/automation,gosipyan/automation,SUSE-Cloud/automation,vmoravec/automation,aspiers/automation,SUSE-Cloud/automation,aspiers/automation,vmoravec/automation,SUSE-Cloud/automation
|
dee889de1f38f7fa41d3c41f83574acce2df95bf
|
.github/workflows/main.yml
|
.github/workflows/main.yml
|
name: main
on: [push]
jobs:
check-tests:
runs-on: ubuntu-latest
strategy:
matrix:
python-version: ['3.7', '3.10', 'pypy3.8']
steps:
- uses: actions/checkout@v2
- uses: actions/setup-python@v4
with:
python-version: ${{ matrix.python-version }}
- run: pip install --upgrade pip poetry
- run: poetry install
- run: poetry run make check-tests
check-dev:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: actions/setup-python@v4
with:
python-version: '3.10'
- run: pip install --upgrade pip poetry
- run: poetry install
- run: poetry run make check-lint
- run: poetry run make check-format
- run: poetry run poetry build
release:
if: startsWith(github.ref, 'refs/tags')
runs-on: ubuntu-latest
needs: [check-tests, check-dev]
steps:
- uses: actions/checkout@v2
- uses: actions/setup-python@v4
with:
python-version: '3.10'
- run: pip install --upgrade pip poetry
- run: poetry run poetry build
- uses: pypa/gh-action-pypi-publish@master
with:
user: __token__
password: ${{ secrets.pypi_token }}
|
name: main
on: [push]
jobs:
check-tests:
runs-on: ubuntu-latest
strategy:
matrix:
python-version: ['3.7', '3.10', 'pypy3.8']
steps:
- uses: actions/checkout@v2
- uses: actions/setup-python@v4
with:
python-version: ${{ matrix.python-version }}
- run: pip install --upgrade pip poetry
- run: poetry install
- run: poetry run make check-tests
check-dev:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: actions/setup-python@v4
with:
python-version: '3.10'
- run: pip install --upgrade pip poetry
- run: poetry install
- run: poetry run make check-lint
- run: poetry run make check-format
- run: poetry build
release:
if: startsWith(github.ref, 'refs/tags')
runs-on: ubuntu-latest
needs: [check-tests, check-dev]
steps:
- uses: actions/checkout@v2
- uses: actions/setup-python@v4
with:
python-version: '3.10'
- run: pip install --upgrade pip poetry
- run: poetry build
- uses: pypa/gh-action-pypi-publish@master
with:
user: __token__
password: ${{ secrets.pypi_token }}
|
Remove redundant poetry run in action
|
Remove redundant poetry run in action
|
YAML
|
mit
|
bbc2/shuffled
|
e96282ad2f4cd9c3576e81a5a4df53dc66c84b8f
|
.github/workflows/main.yml
|
.github/workflows/main.yml
|
name: CI
on:
push:
branches:
- master
pull_request:
branches:
- master
jobs:
unit_tests:
name: Unit Tests
runs-on: macOS-latest
steps:
- name: Checkout
uses: actions/checkout@v1
- name: Install Gem Dependencies
run: bundle install
- name: Run UnitTests
run: bundle exec fastlane test
ui_tests:
name: UI Tests
runs-on: macOS-latest
steps:
- name: Checkout
uses: actions/checkout@v1
- name: Install Gem Dependencies
run: bundle install
- name: Run UnitTests
run: bundle exec fastlane ui_test
snapshot_tests:
name: Snapshot Tests
runs-on: macOS-latest
steps:
- name: Checkout
uses: actions/checkout@v1
- name: Install Gem Dependencies
run: bundle install
- name: Run SnapshotTests
run: bundle exec fastlane snapshot_test
spm:
name: SPM
runs-on: macOS-latest
steps:
- name: Checkout
uses: actions/checkout@v1
- name: Install Gem Dependencies
run: bundle install
- name: Run Swift Build
run: swift build
|
name: CI
on:
push:
branches:
- master
pull_request:
branches:
- master
jobs:
env:
DEVELOPER_DIR: /Applications/Xcode_11.3.app/Contents/Developer
unit_tests:
name: Unit Tests
runs-on: macOS-latest
steps:
- name: Checkout
uses: actions/checkout@v1
- name: Install Gem Dependencies
run: bundle install
- name: Run UnitTests
run: bundle exec fastlane test
ui_tests:
name: UI Tests
runs-on: macOS-latest
steps:
- name: Checkout
uses: actions/checkout@v1
- name: Install Gem Dependencies
run: bundle install
- name: Run UnitTests
run: bundle exec fastlane ui_test
snapshot_tests:
name: Snapshot Tests
runs-on: macOS-latest
steps:
- name: Checkout
uses: actions/checkout@v1
- name: Install Gem Dependencies
run: bundle install
- name: Run SnapshotTests
run: bundle exec fastlane snapshot_test
spm:
name: SPM
runs-on: macOS-latest
steps:
- name: Checkout
uses: actions/checkout@v1
- name: Install Gem Dependencies
run: bundle install
- name: Run Swift Build
run: swift build
|
Use Xcode 11.3 with env
|
Use Xcode 11.3 with env
Found here: https://github.community/t5/GitHub-Actions/Selecting-an-Xcode-version/m-p/32340#M1092
|
YAML
|
mit
|
criticalmaps/criticalmaps-ios,criticalmaps/criticalmaps-ios,headione/criticalmaps-ios,criticalmaps/criticalmaps-ios
|
19190fdfd3ca3d370594d4a2bc2a6e56436d06b3
|
.github/workflows/main.yml
|
.github/workflows/main.yml
|
name: Go
on:
push:
branches: [ master ]
pull_request:
branches: [ master ]
jobs:
build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- name: Set up Go
uses: actions/setup-go@v2
with:
go-version: 1.17
- name: Build
run: go build -v ./...
- name: Test
run: go test -v ./...
|
name: Go
on:
push:
branches: [ master ]
pull_request:
branches: [ master ]
jobs:
build:
runs-on: ubuntu-latest
strategy:
matrix:
go: [ '1.18', '1.17', '1.13' ]
steps:
- uses: actions/checkout@v3
- name: Set up Go
uses: actions/setup-go@v3
with:
go-version: ${{ matrix.go }}
- name: Build
run: go build -v ./...
- name: Test
run: go test -v ./...
|
Add min supported and two most recent Go versions to workflow
|
Add min supported and two most recent Go versions to workflow
|
YAML
|
mit
|
jstemmer/go-junit-report
|
400366786683ebe7f20a40fcabb5ffd32b3fc3f8
|
.github/workflows/main.yml
|
.github/workflows/main.yml
|
name: Tests & Linting
on: push
jobs:
run:
runs-on: ubuntu-20.04
strategy:
matrix:
php-versions: ['7.2', '7.3', '7.4']
fail-fast: false
name: PHP ${{ matrix.php-versions }}
steps:
- name: Checkout
uses: actions/[email protected]
- name: Setup PHP
uses: shivammathur/setup-php@v2
with:
php-version: ${{ matrix.php-versions }}
- name: Get Composer Cache Directory
id: composer-cache
run: |
echo "::set-output name=dir::$(composer config cache-files-dir)"
- uses: actions/[email protected]
with:
path: ${{ steps.composer-cache.outputs.dir }}
key: ${{ runner.os }}-composer-${{ hashFiles('**/composer.lock') }}
restore-keys: |
${{ runner.os }}-composer-
- name: git config (required for PluginsTest)
run: git config --global user.email [email protected] && git config --global user.name you
- name: Run lints and tests
run: ./script/test
|
name: Tests & Linting
on: push
jobs:
run:
runs-on: ubuntu-20.04
strategy:
matrix:
php-versions: ['7.2', '7.3', '7.4']
fail-fast: false
name: PHP ${{ matrix.php-versions }}
steps:
- name: Checkout
uses: actions/[email protected]
- name: Setup PHP
uses: shivammathur/setup-php@v2
with:
php-version: ${{ matrix.php-versions }}
- name: Get Composer Cache Directory
id: composer-cache
run: |
echo "::set-output name=dir::$(composer config cache-files-dir)"
- uses: actions/[email protected]
with:
path: ${{ steps.composer-cache.outputs.dir }}
key: ${{ runner.os }}-composer-${{ hashFiles('**/composer.lock') }}
restore-keys: |
${{ runner.os }}-composer-
- name: git config (required for PluginsTest)
run: git config --global user.email [email protected] && git config --global user.name you
- name: Run lints and tests
run: ./script/test
|
FIX indentation in GitHub actions config
|
FIX indentation in GitHub actions config
|
YAML
|
mit
|
dxw/whippet,dxw/whippet,dxw/whippet
|
d32fc43c89ece844d909110794eb0fbfc2ccaade
|
.github/workflows/ruby.yml
|
.github/workflows/ruby.yml
|
name: CI
on: [push]
jobs:
build:
runs-on: ${{ matrix.os }}
strategy:
matrix:
os: [ubuntu-latest, windows-latest]
ruby: [ '2.3.x', '2.4.x', '2.5.x', '2.6.x' ]
fail-fast: false
name: Ruby ${{ matrix.ruby }} ${{ matrix.os }}
steps:
- uses: actions/checkout@v1
- name: Set up Ruby ${{ matrix.ruby }}
uses: actions/setup-ruby@v1
with:
ruby-version: ${{ matrix.ruby }}
- name: Build and test with Rake
run: |
gem install bundler
bundle install --jobs 4 --retry 3
bundle exec rake
|
name: CI
on: [push]
jobs:
build:
runs-on: ${{ matrix.os }}
strategy:
matrix:
os: [ubuntu-latest, windows-latest]
ruby: [ '2.4.x', '2.5.x', '2.6.x' ]
fail-fast: false
name: Ruby ${{ matrix.ruby }} ${{ matrix.os }}
steps:
- uses: actions/checkout@v1
- name: Set up Ruby ${{ matrix.ruby }}
uses: actions/setup-ruby@v1
with:
ruby-version: ${{ matrix.ruby }}
- name: Build and test with Rake
run: |
gem install bundler
bundle install --jobs 4 --retry 3
bundle exec rake
|
Remove Ruby 2.3, Windows doesn't have it
|
Remove Ruby 2.3, Windows doesn't have it
|
YAML
|
mit
|
dentarg/dyno_metadata
|
2a231cb8eff78364607dc5c64728963af0e7061e
|
.github/workflows/ruby.yml
|
.github/workflows/ruby.yml
|
# This workflow uses actions that are not certified by GitHub.
# They are provided by a third-party and are governed by
# separate terms of service, privacy policy, and support
# documentation.
# This workflow will download a prebuilt Ruby version, install dependencies and run tests with Rake
# For more information see: https://github.com/marketplace/actions/setup-ruby-jruby-and-truffleruby
name: Ruby
on:
push:
branches: [ master ]
pull_request:
branches: [ master ]
jobs:
test:
continue-on-error: ${{ matrix.ruby-version == 'truffleruby-head' }}
env:
BUNDLER_GEMFILE: gemfiles/Gemfile-edge
runs-on: ubuntu-latest
strategy:
matrix:
ruby-version: ['2.5', '2.6', '2.7', '3.0', 'head', 'truffleruby-head']
steps:
- uses: actions/[email protected]
- name: Set up Ruby
uses: ruby/setup-ruby@v1
with:
ruby-version: ${{ matrix.ruby-version }}
bundler-cache: true # runs 'bundle install' and caches installed gems automatically
- name: Install dependencies
run: bundle install
- name: Run tests
run: bundle exec rake
|
# This workflow uses actions that are not certified by GitHub.
# They are provided by a third-party and are governed by
# separate terms of service, privacy policy, and support
# documentation.
# This workflow will download a prebuilt Ruby version, install dependencies and run tests with Rake
# For more information see: https://github.com/marketplace/actions/setup-ruby-jruby-and-truffleruby
name: Ruby
on:
push:
branches: [ master ]
pull_request:
branches: [ master ]
jobs:
test:
continue-on-error: ${{ matrix.ruby-version == 'truffleruby-head' }}
env:
BUNDLER_GEMFILE: gemfiles/Gemfile-edge
runs-on: ubuntu-latest
strategy:
matrix:
ruby-version: ['2.5', '2.6', '2.7', '3.0', '3.1', 'head', 'truffleruby-head']
steps:
- uses: actions/[email protected]
- name: Set up Ruby
uses: ruby/setup-ruby@v1
with:
ruby-version: ${{ matrix.ruby-version }}
bundler-cache: true # runs 'bundle install' and caches installed gems automatically
- name: Run tests
run: bundle exec rake
|
Add Ruby 3.1 to the CI matrix
|
Add Ruby 3.1 to the CI matrix
Also removes an unneeded bundle install step
|
YAML
|
mit
|
fog/fog-aws,fog/fog-aws,fog/fog-aws
|
aad0896da532dd4ee3aab964f8f296fddeb56c35
|
.github/workflows/test.yml
|
.github/workflows/test.yml
|
name: Test
on:
push:
branches:
- master
- release/*
pull_request:
branches:
- master
- release/*
jobs:
run:
runs-on: ubuntu-latest
strategy:
matrix:
php: ['7.4']
name: PHP ${{ matrix.php }}
steps:
- name: Checkout
uses: actions/checkout@v2
# https://github.com/shivammathur/setup-php
- name: Setup
uses: shivammathur/setup-php@v2
with:
php-version: ${{ matrix.php }}
coverage: xdebug
- name: Validate
run: composer validate
- name: Install
run: composer install --prefer-dist --no-progress
- name: Test
run: composer run-script test
|
name: Test
on:
push:
branches:
- master
- release/*
pull_request:
branches:
- master
- release/*
jobs:
run:
runs-on: ubuntu-latest
strategy:
matrix:
php: ['7.4', '8.0']
name: PHP ${{ matrix.php }}
steps:
- name: Checkout
uses: actions/checkout@v2
# https://github.com/shivammathur/setup-php
- name: Setup
uses: shivammathur/setup-php@v2
with:
php-version: ${{ matrix.php }}
coverage: xdebug
- name: Validate
run: composer validate
- name: Install
run: composer install --prefer-dist --no-progress
- name: Test
run: composer run-script test
|
Enable PHP 8.0 in GitHub actions
|
Enable PHP 8.0 in GitHub actions
|
YAML
|
mit
|
zicht/itertools,zicht/itertools
|
24cf6479c8b3cd0c8b17974e85f013dd1efe6975
|
templates/install/ansible/roles/address_controller_multitenant/tasks/main.yml
|
templates/install/ansible/roles/address_controller_multitenant/tasks/main.yml
|
---
- name: Ensure user has cluster admin access
shell: oc policy can-i create clusterroles
register: is_admin
failed_when: is_admin.stdout == "no"
- name: Create cluster wide roles used by enmasse-admin service account
shell: oc apply -n {{ namespace }} -f {{ playbook_dir }}/templates/cluster-roles.yaml
- name: Grant cluster admin privileges to service account
shell: oc adm policy add-cluster-role-to-user enmasse-namespace-admin system:serviceaccount:{{ namespace }}:enmasse-admin
|
---
- name: Ensure user has cluster admin access
shell: oc policy can-i create clusterroles
register: is_admin
failed_when: is_admin.stdout == "no"
- name: Create cluster wide roles used by enmasse-admin service account
shell: oc apply -n {{ namespace }} -f {{ playbook_dir }}/templates/cluster-roles.yaml
- name: Grant cluster admin privileges to service account
shell: oc adm policy add-cluster-role-to-user enmasse-namespace-admin system:serviceaccount:{{ namespace }}:enmasse-admin
- name: Grant view policy to default SA
shell: oc policy add-role-to-user view system:serviceaccount:{{ namespace }}:default
- name: Grant admin policy to enmasse-admin
shell: oc policy add-role-to-user admin system:serviceaccount:{{ namespace }}:enmasse-admin
|
Add missing role grants for multitenant address controller
|
Add missing role grants for multitenant address controller
|
YAML
|
apache-2.0
|
jenmalloy/enmasse,EnMasseProject/enmasse,EnMasseProject/enmasse,EnMasseProject/enmasse,jenmalloy/enmasse,jenmalloy/enmasse,jenmalloy/enmasse,jenmalloy/enmasse,EnMasseProject/enmasse,EnMasseProject/enmasse,jenmalloy/enmasse,jenmalloy/enmasse,EnMasseProject/enmasse,EnMasseProject/enmasse
|
d0333f1e85efe7f12519c6167063dc577aba960a
|
packages/ea/earcut.yaml
|
packages/ea/earcut.yaml
|
homepage: https://github.com/reanimate/earcut
changelog-type: markdown
hash: 275cfb46de72c157ee7ede4c9e57ebf1919a3d7c2449c64326311bfb6e6f7b9b
test-bench-deps: {}
maintainer: [email protected]
synopsis: Binding to C++ earcut library.
changelog: |
# Revision history for earcut
## 0.1.0.0 -- YYYY-mm-dd
* First version. Released on an unsuspecting world.
basic-deps:
base: ^>=4.12.0.0
vector: -any
all-versions:
- 0.1.0.0
- 0.1.0.1
author: David Himmelstrup
latest: 0.1.0.1
description-type: markdown
description: |
# earcut
Haskell binding to C++ earcut implementation.
license-name: ISC
|
homepage: https://github.com/reanimate/earcut
changelog-type: markdown
hash: e45d12697d838b423ef886de610feb2ba18a3ccf2b055d9c6d975e8aa68f860b
test-bench-deps: {}
maintainer: [email protected]
synopsis: Binding to C++ earcut library.
changelog: |
# Revision history for earcut
## 0.1.0.0 -- YYYY-mm-dd
* First version. Released on an unsuspecting world.
basic-deps:
base: '>=4.5 && <4.15'
vector: -any
all-versions:
- 0.1.0.0
- 0.1.0.1
- 0.1.0.2
author: David Himmelstrup
latest: 0.1.0.2
description-type: markdown
description: |
# earcut
Haskell binding to C++ earcut implementation.
license-name: ISC
|
Update from Hackage at 2020-05-09T07:51:11Z
|
Update from Hackage at 2020-05-09T07:51:11Z
|
YAML
|
mit
|
commercialhaskell/all-cabal-metadata
|
a38be58065975f810ce89c9adef638f1fe142064
|
setup.yml
|
setup.yml
|
---
- name: playbook to set up FiPy
hosts: local
connection: local
vars:
local_home: "{{ lookup('env', 'HOME') }}"
anaconda_path: "{{ local_home }}/anaconda"
fipy_path: "{{ local_home}}/git/fipy "
tasks:
- name: download miniconda
get_url:
url: https://repo.continuum.io/miniconda/Miniconda-latest-Linux-x86_64.sh
dest: /tmp
force: no
- name: install bzip2
apt: name=bzip2 state=present
become: true
- name: install miniconda
shell: bash /tmp/Miniconda-latest-Linux-x86_64.sh -b -p {{ anaconda_path }}
args:
creates: "{{ anaconda_path }}"
- name: update conda
shell: "{{ anaconda_path }}/bin/conda update conda"
- name: install fipy requirements
shell: "{{ anaconda_path }}/bin/conda install {{ item }}"
with_items:
- numpy==1.9
- scipy
- pip
- name: install fipy requirements guyer channel
shell: "{{ anaconda_path }}/bin/conda install --channel guyer {{ item }}"
with_items:
- gmsh
- name: install scikit-fmm
pip:
name: scikit-fmm
executable: "{{ anaconda_path }}/bin"
- name: clone fipy
git:
repo: https://github.com/usnistgov/fipy.git
dest: "{{ fipy_dir }}"
version: develop
|
---
- name: playbook to set up FiPy
hosts: local
connection: local
vars:
local_home: "{{ lookup('env', 'HOME') }}"
anaconda_path: "{{ local_home }}/anaconda"
fipy_path: "{{ local_home}}/git/fipy "
tasks:
- name: apt-get installs
apt:
name: "{{ item }}"
state: present
become: true
with_items:
- bzip2
- g++
- name: download miniconda
get_url:
url: https://repo.continuum.io/miniconda/Miniconda-latest-Linux-x86_64.sh
dest: /tmp
force: no
- name: install miniconda
shell: bash /tmp/Miniconda-latest-Linux-x86_64.sh -b -p {{ anaconda_path }}
args:
creates: "{{ anaconda_path }}"
- name: update conda
shell: "{{ anaconda_path }}/bin/conda update conda"
- name: install fipy requirements
shell: "{{ anaconda_path }}/bin/conda install {{ item }}"
with_items:
- numpy==1.9
- scipy
- pip
- name: install fipy requirements from guyer channel
shell: "{{ anaconda_path }}/bin/conda install --channel guyer {{ item }}"
with_items:
- gmsh
- name: install scikit-fmm
pip:
name: scikit-fmm
executable: "{{ anaconda_path }}/bin/pip"
- name: clone fipy
git:
repo: https://github.com/usnistgov/fipy.git
dest: "{{ fipy_dir }}"
version: develop
|
Add g++ prerequisite for skfmm
|
Add g++ prerequisite for skfmm
|
YAML
|
mit
|
wd15/fipy-dockerize,wd15/mdc-dockerize
|
b593da6228d9700c6c4720c796ac9eb56e7b0704
|
_data/main.yml
|
_data/main.yml
|
sections:
- id: backup
title: Backup and Sync
icon: hdd
- id: banking
title: Banking
icon: dollar
- id: cloud
title: Cloud Computing
icon: cloud
- id: communication
title: Communication
icon: chat
- id: bitcoin
title: Cryptocurrencies
icon: bitcoin
- id: developer
title: Developer
icon: code
- id: domains
title: Domains
icon: globe
- id: education
title: Education
icon: book
- id: email
title: Email
icon: mail
- id: entertainment
title: Entertainment
icon: music
- id: finance
title: Finance
icon: money
- id: gaming
title: Gaming
icon: gamepad
- id: health
title: Health
icon: medkit
- id: hosting
title: Hosting/VPS
icon: sitemap
- id: identity
title: Identity Management
icon: basic id
- id: investing
title: Investing
icon: basic money
- id: other
title: Other
icon: lab
- id: payments
title: Payments
icon: payment
- id: remote
title: Remote Access
icon: desktop
- id: retail
title: Retail
icon: cart
- id: security
title: Security
icon: lock
- id: social
title: Social
icon: users
- id: transport
title: Transport
icon: car
- id: utilities
title: Utilities
icon: phone
|
sections:
- id: backup
title: Backup and Sync
icon: disk outline
- id: banking
title: Banking
icon: dollar
- id: cloud
title: Cloud Computing
icon: cloud
- id: communication
title: Communication
icon: chat
- id: bitcoin
title: Cryptocurrencies
icon: bitcoin
- id: developer
title: Developer
icon: code
- id: domains
title: Domains
icon: globe
- id: education
title: Education
icon: book
- id: email
title: Email
icon: mail
- id: entertainment
title: Entertainment
icon: music
- id: finance
title: Finance
icon: money
- id: gaming
title: Gaming
icon: gamepad
- id: health
title: Health
icon: medkit
- id: hosting
title: Hosting/VPS
icon: sitemap
- id: identity
title: Identity Management
icon: user
- id: investing
title: Investing
icon: basic money
- id: other
title: Other
icon: lab
- id: payments
title: Payments
icon: payment
- id: remote
title: Remote Access
icon: desktop
- id: retail
title: Retail
icon: cart
- id: security
title: Security
icon: lock
- id: social
title: Social
icon: users
- id: transport
title: Transport
icon: car
- id: utilities
title: Utilities
icon: phone
|
Add backup & identity manager logo
|
Add backup & identity manager logo
|
YAML
|
mit
|
vgarmash/twofactorauth,jdavis/twofactorauth,hilliuse/twofactorauth,Jawshy/twofactorauth,RichJeanes/twofactorauth,RichJeanes/twofactorauth,marcryan/twofactorauth,jtamplin/twofactorauth,seitsu/twofactorauth,NormandLemay/twofactorauth,mark-adams/twofactorauth,baileybasiscode/twofactorauth,rugk/twofactorauth,rugk/twofactorauth,Criminalllz/twofactorauth,RichJeanes/twofactorauth,ilyakatz/twofactorauth,runasand/twofactorauth,gingerbeardman/twofactorauth,Nitrokey/dongleauth,gingerbeardman/twofactorauth,hilliuse/twofactorauth,jdavis/twofactorauth,Criminalllz/twofactorauth,Nanotraktor/twofactorauth,gingerbeardman/twofactorauth,vgarmash/twofactorauth,cez81/twofactorauth,bceasydns/twofactorauth,baileybasiscode/twofactorauth,ahelpingchip/twofactorauth,veratu/twofactorauth,bjorno/twofactorauth,rugk/twofactorauth,Nanotraktor/twofactorauth,kojaxs/twofactorauth,jdavis/twofactorauth,rj175/twofactorauth,seitsu/twofactorauth,veratu/twofactorauth,hilliuse/twofactorauth,kojaxs/twofactorauth,veratu/twofactorauth,Jawshy/twofactorauth,brierjon/twofactorauth,ilyakatz/twofactorauth,jtamplin/twofactorauth,bjorno/twofactorauth,Nanotraktor/twofactorauth,Criminalllz/twofactorauth,jtamplin/twofactorauth,rj175/twofactorauth,bjorno/twofactorauth,cez81/twofactorauth,kojaxs/twofactorauth,stephengroat/twofactorauth,baileybasiscode/twofactorauth,bceasydns/twofactorauth,brierjon/twofactorauth,rj175/twofactorauth,Nitrokey/dongleauth,stephengroat/twofactorauth,runasand/twofactorauth,NormandLemay/twofactorauth,cez81/twofactorauth,stephengroat/twofactorauth,marcryan/twofactorauth,Nitrokey/dongleauth,Jawshy/twofactorauth,runasand/twofactorauth,bceasydns/twofactorauth,seitsu/twofactorauth,mark-adams/twofactorauth,vgarmash/twofactorauth,mark-adams/twofactorauth,NormandLemay/twofactorauth,ilyakatz/twofactorauth,ahelpingchip/twofactorauth,brierjon/twofactorauth,ahelpingchip/twofactorauth,marcryan/twofactorauth,Jawshy/twofactorauth
|
26f43ee3fd5997524af1cbc6db191b92638cf763
|
config/mongoid.yml
|
config/mongoid.yml
|
development:
clients:
default:
# MONGODB_URI includes draft_content_store_development or content_store_development
# depending on whether we're running content store in draft mode or not.
uri: <%= ENV['MONGODB_URI'] || 'mongodb://localhost/content_store_development' %>
options:
write:
w: 1
test:
clients:
default:
uri: <%= ENV['TEST_MONGODB_URI'] || 'mongodb://localhost/content_store_test' %>
options:
write:
w: 1
# In the test environment we lower the retries and retry interval to
# low amounts for fast failures.
max_retries: 1
retry_interval: 0
production:
clients:
default:
uri: <%= ENV['MONGODB_URI'] %>
options:
ssl: <%= ENV['MONGO_SSL'] || 'false' %>
ssl_verify: <%= ENV['MONGO_SSL_VERIFY'] || 'true' %>
server_selection_timeout: 5
write:
w: <%= ENV['MONGO_WRITE_CONCERN'] || 'majority' %>
read:
mode: :secondary_preferred
|
development:
clients:
default:
# MONGODB_URI includes draft_content_store_development or content_store_development
# depending on whether we're running content store in draft mode or not.
uri: <%= ENV['MONGODB_URI'] || 'mongodb://localhost/content_store_development' %>
options:
write:
w: 1
test:
clients:
default:
uri: <%= ENV['TEST_MONGODB_URI'] || 'mongodb://localhost/content_store_test' %>
options:
write:
w: 1
# In the test environment we lower the retries and retry interval to
# low amounts for fast failures.
max_retries: 1
retry_interval: 0
production:
clients:
default:
uri: <%= ENV['MONGODB_URI'] %>
options:
ssl: <%= ENV['MONGO_SSL'] || 'false' %>
ssl_verify: <%= ENV['MONGO_SSL_VERIFY'] || 'true' %>
server_selection_timeout: 5
write:
w: <%= ENV['MONGO_WRITE_CONCERN'] || 'majority' %>
read:
mode: :nearest
|
Use the `nearest` read mode for MongoDB
|
Use the `nearest` read mode for MongoDB
The Content Store is a read heavy application, as it only processes
writes when things change, but it's constantly serving the frontend
applications with data.
This can be seen in the load on the MongoDB machines, currently the
primary gets off lightly with a low load, and the secondary machines
handle most of the operations.
Using the nearest MongoDB machine should spread the load more evenly,
and might even increase performance due to reduced network latency.
I've been looking at this, as the MongoDB machines often complain
about high load (especially on the AWS side of Staging).
|
YAML
|
mit
|
alphagov/content-store,alphagov/content-store
|
d6f77ca2e99b512f5b9e1e212fc28efe4edd55c1
|
metadata/com.dev.xavier.tempusromanum.yml
|
metadata/com.dev.xavier.tempusromanum.yml
|
Categories:
- Time
License: Apache-2.0
SourceCode: https://github.com/xavierfreyburger/tempus-romanum
IssueTracker: https://github.com/xavierfreyburger/tempus-romanum/issues
Donate: https://www.paypal.me/XavierFreyburger
AutoName: Tempus Romanum
Description: |-
Tempus Romanum provides a simple tool to generate dates in latin format.
You can also add a widget to see the date of the day in latin on your homescreen.
RepoType: git
Repo: https://github.com/xavierfreyburger/tempus-romanum
Builds:
- versionName: '1.0'
versionCode: 1
commit: v1.0
subdir: app
gradle:
- yes
- versionName: '2.0'
versionCode: 20
commit: v2.0
subdir: app
gradle:
- yes
- versionName: '2.1'
versionCode: 21
commit: v2.1
subdir: app
gradle:
- yes
- versionName: 2.1.1
versionCode: 211
commit: v2.1.1
subdir: app
gradle:
- yes
AutoUpdateMode: Version v%v
UpdateCheckMode: Tags
CurrentVersion: 2.1.1
CurrentVersionCode: 211
|
Categories:
- Time
License: Apache-2.0
SourceCode: https://github.com/xavierfreyburger/tempus-romanum
IssueTracker: https://github.com/xavierfreyburger/tempus-romanum/issues
Donate: https://www.paypal.me/XavierFreyburger
AutoName: Tempus Romanum
Description: |-
Tempus Romanum provides a simple tool to generate dates in latin format.
You can also add a widget to see the date of the day in latin on your homescreen.
RepoType: git
Repo: https://github.com/xavierfreyburger/tempus-romanum
Builds:
- versionName: '1.0'
versionCode: 1
commit: v1.0
subdir: app
gradle:
- yes
- versionName: '2.0'
versionCode: 20
commit: v2.0
subdir: app
gradle:
- yes
- versionName: '2.1'
versionCode: 21
commit: v2.1
subdir: app
gradle:
- yes
- versionName: 2.1.1
versionCode: 211
commit: v2.1.1
subdir: app
gradle:
- yes
- versionName: 2.2.0
versionCode: 220
commit: v2.2.0
subdir: app
gradle:
- yes
AutoUpdateMode: Version v%v
UpdateCheckMode: Tags
CurrentVersion: 2.2.0
CurrentVersionCode: 220
|
Update Tempus Romanum to 2.2.0 (220)
|
Update Tempus Romanum to 2.2.0 (220)
|
YAML
|
agpl-3.0
|
f-droid/fdroiddata,f-droid/fdroiddata
|
88143cf71aeb141e8dc2775e49c5df0863134da3
|
metadata/me.hackerchick.raisetoanswer.yml
|
metadata/me.hackerchick.raisetoanswer.yml
|
Categories:
- Phone & SMS
License: MIT
SourceCode: https://github.com/TheLastProject/RaiseToAnswer
IssueTracker: https://github.com/TheLastProject/RaiseToAnswer/issues
AutoName: Raise To Answer
RepoType: git
Repo: https://github.com/TheLastProject/RaiseToAnswer.git
Builds:
- versionName: 1.3.4
versionCode: 9
commit: v1.3.4
subdir: app
gradle:
- yes
- versionName: 1.3.5
versionCode: 10
commit: v1.3.5
subdir: app
gradle:
- yes
AutoUpdateMode: Version v%v
UpdateCheckMode: Tags
CurrentVersion: 1.3.5
CurrentVersionCode: 10
|
Categories:
- Phone & SMS
License: MIT
SourceCode: https://github.com/TheLastProject/RaiseToAnswer
IssueTracker: https://github.com/TheLastProject/RaiseToAnswer/issues
AutoName: Raise To Answer
RepoType: git
Repo: https://github.com/TheLastProject/RaiseToAnswer.git
Builds:
- versionName: 1.3.4
versionCode: 9
commit: v1.3.4
subdir: app
gradle:
- yes
- versionName: 1.3.5
versionCode: 10
commit: v1.3.5
subdir: app
gradle:
- yes
- versionName: 2.0.1
versionCode: 12
commit: v2.0.1
subdir: app
gradle:
- yes
AutoUpdateMode: Version v%v
UpdateCheckMode: Tags
CurrentVersion: 2.0.1
CurrentVersionCode: 12
|
Update Raise To Answer to 2.0.1 (12)
|
Update Raise To Answer to 2.0.1 (12)
|
YAML
|
agpl-3.0
|
f-droid/fdroiddata,f-droid/fdroiddata
|
ce39dbac56123354576d2c31674e1b18535b0111
|
environments/neutron-ovs-dvr.yaml
|
environments/neutron-ovs-dvr.yaml
|
# A Heat environment file that enables DVR in the overcloud.
# This works by configuring L3 and Metadata agents on the
# compute nodes.
resource_registry:
OS::TripleO::Services::ComputeNeutronL3Agent: ../puppet/services/neutron-l3-compute-dvr.yaml
OS::TripleO::Services::ComputeNeutronMetadataAgent: ../puppet/services/neutron-metadata.yaml
# With DVR enabled, the Compute nodes also need the br-ex bridge to be
# connected to a physical network.
OS::TripleO::Compute::Net::SoftwareConfig: ../net-config-bridge.yaml
parameter_defaults:
# DVR requires that the L2 population feature is enabled
NeutronMechanismDrivers: ['openvswitch', 'l2population']
NeutronEnableL2Pop: 'True'
# Setting NeutronEnableDVR enables distributed routing support in the
# ML2 plugin and agents that support this feature
NeutronEnableDVR: true
# We also need to set the proper agent mode for the L3 agent. This will only
# affect the agent on the controller node.
NeutronL3AgentMode: 'dvr_snat'
|
# A Heat environment file that enables DVR in the overcloud.
# This works by configuring L3 and Metadata agents on the
# compute nodes.
resource_registry:
OS::TripleO::Services::ComputeNeutronL3Agent: ../puppet/services/neutron-l3-compute-dvr.yaml
OS::TripleO::Services::ComputeNeutronMetadataAgent: ../puppet/services/neutron-metadata.yaml
# With DVR enabled, the Compute nodes also need the br-ex bridge to be
# connected to a physical network.
OS::TripleO::Compute::Net::SoftwareConfig: ../net-config-bridge.yaml
parameter_defaults:
# DVR requires that the L2 population feature is enabled
NeutronMechanismDrivers: ['openvswitch', 'l2population']
NeutronEnableL2Pop: 'True'
# Setting NeutronEnableDVR enables distributed routing support in the
# ML2 plugin and agents that support this feature
NeutronEnableDVR: true
# We also need to set the proper agent mode for the L3 agent. This will only
# affect the agent on the controller node.
NeutronL3AgentMode: 'dvr_snat'
# L3 HA isn't supported for DVR enabled routers. If upgrading from a system
# where L3 HA is enabled and has neutron routers configured, it is
# recommended setting this value to true until such time all routers can be
# migrated to DVR routers. Once migration of the routers is complete,
# NeutronL3HA can be returned to false. All new systems should be deployed
# with NeutronL3HA set to false.
NeutronL3HA: false
|
Set NeutronL3HA to false when deploying DVR
|
Set NeutronL3HA to false when deploying DVR
This patch sets NeutronL3HA to false when the DVR environment is
included on deployment. L3 HA isn't supported with DVR routers and is
disabled by default in our templates but upgrading users might have it
enabled in their existing overclouds. Including the setting here with
an appropriate comment gives us some sort of mechanism to convey to the
upgrading user what they need to do. This will have no effect on users
not using L3 HA or deploying new overclouds.
Change-Id: If036ae2b88225f5a7d5f295eb5e6874d2fbb3f72
|
YAML
|
apache-2.0
|
trozet/opnfv-tht,openstack/tripleo-heat-templates,openstack/tripleo-heat-templates,dprince/tripleo-heat-templates,dprince/tripleo-heat-templates,trozet/opnfv-tht,trozet/opnfv-tht
|
d8fdef63ac6a68f85f4cbb881b323d63462b8745
|
.pyup.yml
|
.pyup.yml
|
# PyUp config
# https://pyup.io/docs/bot/config/
# Check dependencies in _only_ requirements-app and requirements-dev, and open PRs with PyUp prefix.
search: False
schedule: "every week on sunday"
requirements:
- requirements-app.txt:
update: all
pin: True
- requirements-dev.txt:
update: all
pin: True
- requirements.txt:
update: False
pr_prefix: "PyUp - "
|
# PyUp config
# https://pyup.io/docs/bot/config/
# Check dependencies in _only_ requirements-app, and open PRs with PyUp prefix.
search: False
schedule: "every week on sunday"
requirements:
- requirements-app.txt:
update: all
pin: True
- requirements-dev.txt:
update: False
pin: True
- requirements.txt:
update: False
pr_prefix: "PyUp - "
|
Remove test requirements from PyUp
|
Remove test requirements from PyUp
|
YAML
|
mit
|
alphagov/digitalmarketplace-search-api,alphagov/digitalmarketplace-search-api
|
67e5ad1953971e11e33fa9e41b737a66134c663d
|
.pyup.yml
|
.pyup.yml
|
# PyUp config
# https://pyup.io/docs/bot/config/
# Check dependencies in _only_ requirements-app, and open PRs with PyUp prefix.
search: False
schedule: "every week on sunday"
requirements:
- requirements-app.txt:
update: all
pin: True
- requirements-dev.txt:
update: False
pin: True
- requirements.txt:
update: False
pr_prefix: "PyUp - "
|
# PyUp config
# https://pyup.io/docs/bot/config/
# Check dependencies in _only_ requirements-app, and open PRs with PyUp prefix.
search: False
schedule: "every week on sunday"
requirements:
- requirements.in:
update: all
pin: True
- requirements-dev.in:
update: False
pin: True
- requirements.txt:
update: False
- requirements-dev.txt:
update: False
pr_prefix: "PyUp - "
|
Update PyUp configuration for pip-tools
|
Update PyUp configuration for pip-tools
|
YAML
|
mit
|
alphagov/digitalmarketplace-buyer-frontend,alphagov/digitalmarketplace-buyer-frontend,alphagov/digitalmarketplace-buyer-frontend,alphagov/digitalmarketplace-buyer-frontend
|
ff7a2200af5fb65b68a8e149c8fc0e8188d9b0ca
|
.zuul.yml
|
.zuul.yml
|
name: Ahoy.js
ui: tape
html: test/template.html
scripts:
- dist/ahoy.js
- test/test_helper.js
browsers:
- name: chrome
version: latest
- name: safari
version: latest
- name: ie
version: latest
|
name: Ahoy.js
ui: tape
html: test/template.html
scripts:
- dist/ahoy.js
- test/test_helper.js
browsers:
- name: chrome
version: latest
- name: safari
version: latest
# - name: ie
# version: latest
|
Disable IE tests for now
|
Disable IE tests for now
|
YAML
|
mit
|
ankane/ahoy.js,ankane/ahoy.js
|
05532e1ac7ccb377070f611848cbd46e6b78a24b
|
ci/config.yaml
|
ci/config.yaml
|
checks:
required_workflows:
main.yaml: False
rebuild.yaml: False
versions:
rebuild: False
codespell:
ignore_re:
- ^contribs/gmf/test/spec/data/themescapabilities\.js$
- ^contribs/gmf/test/spec/data/themes\.js$
- ^buildtools/asitvd.capabilities\.xml$
- ^examples/measure\.js$
- ^src/bootstrap-custom\.css\.map$
dependabot_config:
update_ignore:
- directory: /
package-ecosystem: npm
audit:
npm:
package_ignore:
- webpack-dev-server
- typedoc
- node-sass
- webpack-cli
- angular-gettext-tools
- '@storybook/core'
|
# yaml-language-server: $schema=https://raw.githubusercontent.com/camptocamp/c2cciutils/master/c2cciutils/schema.json
checks:
required_workflows:
main.yaml: False
rebuild.yaml: False
versions:
rebuild: False
codespell:
ignore_re:
- ^contribs/gmf/test/spec/data/themescapabilities\.js$
- ^contribs/gmf/test/spec/data/themes\.js$
- ^buildtools/asitvd.capabilities\.xml$
- ^examples/measure\.js$
- ^src/bootstrap-custom\.css\.map$
dependabot_config:
update_ignore:
- directory: /
package-ecosystem: npm
audit:
npm:
package_ignore:
- webpack-dev-server
- typedoc
- node-sass
- webpack-cli
- angular-gettext-tools
- '@storybook/core'
- webpack
- i18next-parser
- '@storybook/web-components'
|
Add package on which we will ignore the CVE
|
Add package on which we will ignore the CVE
|
YAML
|
mit
|
camptocamp/ngeo,camptocamp/ngeo,camptocamp/ngeo,camptocamp/ngeo,camptocamp/ngeo
|
c4d79899bf93d1400680a9de1d5780f36eb01c2b
|
metadata/de.spiritcroc.darkcroc.substratum.yml
|
metadata/de.spiritcroc.darkcroc.substratum.yml
|
Categories:
- Theming
License: Apache-2.0
SourceCode: https://github.com/SpiritCroc/DarkCroc-Android-theme
IssueTracker: https://github.com/SpiritCroc/DarkCroc-Android-theme/issues
Changelog: https://github.com/SpiritCroc/DarkCroc-Android-theme/releases
Summary: A dark Substratum theme targeting Android 9+
Description: |-
'''IMPORTANT:''' You need the Substratum theming platform working on your
device! Read the inbuilt information before reporting any issues!
Theme for Substratum, originally based on the default AOSP dark material
colors; then extended to provide more variants to allow configuration
according to the user's taste.
This is the successor theme of [[de.spiritcroc.defaultdarktheme_oms]] series
for newer Android versions, starting with Android 9.
RepoType: git
Repo: https://github.com/SpiritCroc/DarkCroc-Android-theme
Builds:
- versionName: '1.1'
versionCode: 2
commit: v1.1
subdir: app
gradle:
- yes
prebuild: |-
sed -i -e 's/\"generate_/\"\.\.\/generate_/g' build.gradle
AutoUpdateMode: Version v%v
UpdateCheckMode: Tags
CurrentVersion: '1.1'
CurrentVersionCode: 2
|
Categories:
- Theming
License: Apache-2.0
SourceCode: https://github.com/SpiritCroc/DarkCroc-Android-theme
IssueTracker: https://github.com/SpiritCroc/DarkCroc-Android-theme/issues
Changelog: https://github.com/SpiritCroc/DarkCroc-Android-theme/releases
AutoName: DarkCroc Theme
Summary: A dark Substratum theme targeting Android 9+
Description: |-
'''IMPORTANT:''' You need the Substratum theming platform working on your
device! Read the inbuilt information before reporting any issues!
Theme for Substratum, originally based on the default AOSP dark material
colors; then extended to provide more variants to allow configuration
according to the user's taste.
This is the successor theme of [[de.spiritcroc.defaultdarktheme_oms]] series
for newer Android versions, starting with Android 9.
RepoType: git
Repo: https://github.com/SpiritCroc/DarkCroc-Android-theme
Builds:
- versionName: '1.1'
versionCode: 2
commit: v1.1
subdir: app
gradle:
- yes
prebuild: |-
sed -i -e 's/\"generate_/\"\.\.\/generate_/g' build.gradle
AutoUpdateMode: Version v%v
UpdateCheckMode: Tags
CurrentVersion: '1.1'
CurrentVersionCode: 2
|
Set autoname of DarkCroc Theme
|
Set autoname of DarkCroc Theme
|
YAML
|
agpl-3.0
|
f-droid/fdroid-data,f-droid/fdroiddata,f-droid/fdroiddata
|
005225cd1f78462c13d43ac2c1ea067d0d2c0d55
|
site/en/performance/_toc.yaml
|
site/en/performance/_toc.yaml
|
toc:
- title: Performance
path: /performance/
- title: Performance guide
path: /performance/performance_guide
- title: Input pipeline performance guide
path: /performance/datasets_performance
- title: Benchmarks
path: /performance/benchmarks
- heading: Model optimization toolkit
- title: Model optimization
path: /performance/model_optimization
- title: Quantization-aware training
path: /performance/quantization_training
- title: Post-training quantization
path: /performance/post_training_quantization
- title: Post-training quantization example
path: https://github.com/tensorflow/tensorflow/blob/master/tensorflow/contrib/lite/tutorials/post_training_quant.ipynb
status: external
- heading: XLA
- title: XLA overview
path: /performance/xla/
- title: Broadcasting semantics
path: /performance/xla/broadcasting
- title: Developing a new backend for XLA
path: /performance/xla/developing_new_backend
- title: Using JIT compilation
path: /performance/xla/jit
- title: Operation semantics
path: /performance/xla/operation_semantics
- title: Shapes and layout
path: /performance/xla/shapes
- title: Using AOT compilation
path: /performance/xla/tfcompile
|
toc:
- title: Performance
path: /performance/
- title: Performance guide
path: /performance/performance_guide
- title: Input pipeline performance guide
path: /performance/datasets_performance
- title: Benchmarks
path: /performance/benchmarks
- heading: Model optimization toolkit
- title: Model optimization
path: /performance/model_optimization
- title: Post-training quantization
path: /performance/post_training_quantization
- title: Post-training quantization example
path: https://github.com/tensorflow/tensorflow/blob/master/tensorflow/contrib/lite/tutorials/post_training_quant.ipynb
status: external
- heading: XLA
- title: XLA overview
path: /performance/xla/
- title: Broadcasting semantics
path: /performance/xla/broadcasting
- title: Developing a new backend for XLA
path: /performance/xla/developing_new_backend
- title: Using JIT compilation
path: /performance/xla/jit
- title: Operation semantics
path: /performance/xla/operation_semantics
- title: Shapes and layout
path: /performance/xla/shapes
- title: Using AOT compilation
path: /performance/xla/tfcompile
|
Remove navigation link for quantization aware training.
|
Remove navigation link for quantization aware training.
|
YAML
|
apache-2.0
|
tensorflow/docs,tensorflow/docs-l10n,tensorflow/docs,tensorflow/docs
|
042858d98a77a8b3875fd00f9bccacf996a47a62
|
vitrage/tests/resources/templates/general/host_high_cpu_load_to_instance_cpu_suboptimal.yaml
|
vitrage/tests/resources/templates/general/host_high_cpu_load_to_instance_cpu_suboptimal.yaml
|
metadata:
name: host_high_cpu_load_to_instance_cpu_suboptimal
definitions:
entities:
- entity:
category: ALARM
type: HOST_HIGH_CPU_LOAD
template_id: alarm1
- entity:
category: ALARM
type: VM_CPU_SUBOPTIMAL_PERFORMANCE
template_id: alarm2
- entity:
category: RESOURCE
type: nova.host
template_id: host
- entity:
category: RESOURCE
type: nova.instance
template_id: instance
relationships:
- relationship:
source: alarm1
target: host
relationship_type: on
template_id : alarm_on_host
- relationship:
source: alarm2
target: instance
relationship_type: on
template_id : alarm_on_instance
- relationship:
source: host
target: instance
relationship_type: contains
template_id : host_contains_instance
scenarios:
- scenario:
condition: alarm_on_host and host_contains_instance
actions:
- action:
action_type: raise_alarm
properties:
alarm_name: VM_CPU_SUBOPTIMAL_PERFORMANCE
severity: critical
action_target:
target: instance
- action:
action_type: set_state
properties:
state: SUBOPTIMAL
action_target:
target: instance
- scenario:
condition: alarm_on_host and alarm_on_instance and host_contains_instance
actions:
- action:
action_type: add_causal_relationship
action_target:
source: alarm1
target: alarm2
|
metadata:
name: host_high_cpu_load_to_instance_cpu_suboptimal
definitions:
entities:
- entity:
category: ALARM
name: HOST_HIGH_CPU_LOAD
template_id: alarm1
- entity:
category: ALARM
name: VM_CPU_SUBOPTIMAL_PERFORMANCE
template_id: alarm2
- entity:
category: RESOURCE
type: nova.host
template_id: host
- entity:
category: RESOURCE
type: nova.instance
template_id: instance
relationships:
- relationship:
source: alarm1
target: host
relationship_type: on
template_id : alarm_on_host
- relationship:
source: alarm2
target: instance
relationship_type: on
template_id : alarm_on_instance
- relationship:
source: host
target: instance
relationship_type: contains
template_id : host_contains_instance
scenarios:
- scenario:
condition: alarm_on_host and host_contains_instance
actions:
- action:
action_type: raise_alarm
properties:
alarm_name: VM_CPU_SUBOPTIMAL_PERFORMANCE
severity: critical
action_target:
target: instance
- action:
action_type: set_state
properties:
state: SUBOPTIMAL
action_target:
target: instance
- scenario:
condition: alarm_on_host and alarm_on_instance and host_contains_instance
actions:
- action:
action_type: add_causal_relationship
action_target:
source: alarm1
target: alarm2
|
Fix error in tests example
|
Fix error in tests example
Change-Id: I93ec441dcfcab11c08b98c36e2964600a2d4dcf5
|
YAML
|
apache-2.0
|
openstack/vitrage,openstack/vitrage,openstack/vitrage
|
97e5a7e63fa5776358f49490de5c9533a4ed0aa7
|
packages/li/liquidhaskell-cabal-demo.yaml
|
packages/li/liquidhaskell-cabal-demo.yaml
|
homepage: https://github.com/spinda/liquidhaskell-cabal-demo#readme
changelog-type: ''
hash: cd6cfc54e1eb79c01bcb2b4db39e24cd9c6a7b742c03d5004dda9abcf9157d76
test-bench-deps: {}
maintainer: Michael Smith <[email protected]>
synopsis: Demo of Liquid Haskell integration for Cabal and stack
changelog: ''
basic-deps:
base: ! '>=4.8 && <5'
liquidhaskell-cabal: ! '>=0.2'
all-versions:
- 0.1.1.0
- 0.2.0.0
- 0.2.0.1
author: Michael Smith
latest: 0.2.0.1
description-type: haddock
description: |-
Please see the
<https://github.com/spinda/liquidhaskell-cabal-demo/blob/0.2.0.1/README.md README>
on GitHub for more information.
license-name: BSD-3-Clause
|
homepage: https://github.com/spinda/liquidhaskell-cabal-demo#readme
changelog-type: ''
hash: cd6cfc54e1eb79c01bcb2b4db39e24cd9c6a7b742c03d5004dda9abcf9157d76
test-bench-deps: {}
maintainer: Michael Smith <[email protected]>
synopsis: Demo of Liquid Haskell integration for Cabal and stack
changelog: ''
basic-deps:
base: ! '>=4.8 && <5'
liquidhaskell-cabal: ! '>=0.2'
all-versions:
- 0.1.1.0
- 0.2.0.1
author: Michael Smith
latest: 0.2.0.1
description-type: haddock
description: |-
Please see the
<https://github.com/spinda/liquidhaskell-cabal-demo/blob/0.2.0.1/README.md README>
on GitHub for more information.
license-name: BSD-3-Clause
|
Update from Hackage at 2019-04-12T23:36:39Z
|
Update from Hackage at 2019-04-12T23:36:39Z
|
YAML
|
mit
|
commercialhaskell/all-cabal-metadata
|
3a894f71a3017139af1c7d533f4ba9a312a80323
|
osa/playbooks/roles/known_hosts/tasks/main.yml
|
osa/playbooks/roles/known_hosts/tasks/main.yml
|
---
# Copyright 2016 IBM Corp.
#
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
- name: get the public key associated with the inventory field ["{{ key }}"] for all nodes
shell: ssh-keyscan -H -t ssh-rsa "{{ hostvars[item][key] }}"
register: ssh_hostkeys
ignore_errors: true
with_inventory_hostnames: all
when: "{{ key in hostvars[item] }}"
#- debug: var=ssh_hostkeys
- name: Add keys to known_hosts file
known_hosts:
path: "/root/.ssh/known_hosts"
name: "{{ item.stderr.split(' ')[1].split(':')[0] }}"
# stderr looks like '# ipaddr/hostname:22 SSH-2.0-...'
key: "{{ item.stdout }}"
when:
- "{{ item.rc is defined }}"
- item.rc == 0
with_items:
- "{{ ssh_hostkeys.results }}"
|
---
# Copyright 2016 IBM Corp.
#
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
- name: get the public key associated with the inventory field ["{{ key }}"] for all nodes
shell: ssh-keyscan -H -t ssh-rsa "{{ hostvars[item][key] }}"
register: ssh_hostkeys
ignore_errors: true
with_inventory_hostnames: all
when: "{{ key in hostvars[item] }}"
#- debug: var=ssh_hostkeys
- name: Add keys to known_hosts file
known_hosts:
path: "/root/.ssh/known_hosts"
name: "{{ item.stderr.split(' ')[1].split(':')[0] }}"
# stderr looks like '# ipaddr/hostname:22 SSH-2.0-...'
key: "{{ item.stdout }}"
when:
- "{{ item.rc is defined }}"
- item.rc == 0
- "{{ item.stdout is defined }}"
- item.stdout
with_items:
- "{{ ssh_hostkeys.results }}"
|
Fix Ubuntu 16.04 incompatibility with ssh-keyscan
|
Fix Ubuntu 16.04 incompatibility with ssh-keyscan
On error it returns 0 with stdout empty. On Ubuntu 14.04,
it returned 255 exit status.
Fix is to ensure that stdout is not empty.
Change-Id: Id01717e420128f191541c825a0b97144780514d0
|
YAML
|
apache-2.0
|
open-power-ref-design-toolkit/os-services,open-power-ref-design-toolkit/os-services,open-power-ref-design-toolkit/os-services,open-power-ref-design/os-services,open-power-ref-design/os-services
|
309c58cd153e796230e92834560ecaa08ba78f3e
|
.github/workflows/gradle-build.yml
|
.github/workflows/gradle-build.yml
|
name: "Build Gradle project"
on: [push, workflow_dispatch]
jobs:
build:
runs-on: ubuntu-latest
strategy:
matrix:
java: [ 8, 11, 17 ]
name: Test on Java ${{ matrix.java }}
steps:
- name: Check out project
uses: actions/checkout@v2
- name: Set up JDK 8
uses: actions/setup-java@v1
with:
java-version: 8
- name: Build with Gradle
uses: gradle/gradle-build-action@v1
with:
arguments: build testAll -PtestJavaRuntimeVersion=${{ matrix.java }} -Pgradle-plugindev-plugin.acceptGradleTOS=true
distributions-cache-enabled: true
dependencies-cache-enabled: true
configuration-cache-enabled: false
|
name: "Build Gradle project"
on: [push, workflow_dispatch]
jobs:
build:
runs-on: ubuntu-latest
strategy:
matrix:
java: [ 8, 11, 16 ]
name: Test on Java ${{ matrix.java }}
steps:
- name: Check out project
uses: actions/checkout@v2
- name: Set up JDK 8
uses: actions/setup-java@v1
with:
java-version: 8
- name: Build with Gradle
uses: gradle/gradle-build-action@v1
with:
arguments: build testAll -PtestJavaRuntimeVersion=${{ matrix.java }} -Pgradle-plugindev-plugin.acceptGradleTOS=true
distributions-cache-enabled: true
dependencies-cache-enabled: true
configuration-cache-enabled: false
|
Change from JDK 17 to JDK 16 since 17 is not yet released
|
Change from JDK 17 to JDK 16 since 17 is not yet released
|
YAML
|
apache-2.0
|
etiennestuder/gradle-plugindev-plugin
|
c8a67a8b8ae28aa62e3494c18fb66d41d8c244e3
|
.github/workflows/presubmit-go.yml
|
.github/workflows/presubmit-go.yml
|
name: go-presubmit
on:
push:
branches: ['master']
pull_request:
paths: ['go/**', '.github/workflows/presubmit-go.yml']
defaults:
run:
working-directory: go
jobs:
lint:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- name: gofmt
run: test -z "$(gofmt -d .)"
- name: golint
run: |
go get -u golang.org/x/lint/golint
$(go env GOPATH)/bin/golint -set_exit_status=1 ./...
unit-test:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- name: Run tests
run: go test ./...
|
name: go-presubmit
on:
push:
branches: ['master']
pull_request:
paths: ['go/**', '.github/workflows/presubmit-go.yml']
defaults:
run:
working-directory: go
jobs:
lint:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- name: gofmt
run: test -z "$(gofmt -d . | tee >&2)"
- name: golint
run: |
go get -u golang.org/x/lint/golint
$(go env GOPATH)/bin/golint -set_exit_status=1 ./...
unit-test:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- name: Run tests
run: go test ./...
|
Print the output of `gofmt -d` (if any)
|
Print the output of `gofmt -d` (if any)
|
YAML
|
apache-2.0
|
google/glome,google/glome,google/glome,google/glome,google/glome
|
e4a85742b12f8320f553d7f47b51bb56f5e7c3da
|
.github/workflows/python-tests.yml
|
.github/workflows/python-tests.yml
|
name: Run unit tests
on:
pull_request:
push:
branches:
- master
tags:
workflow_dispatch:
schedule:
# Run every Sunday at 03:53 UTC
- cron: 53 3 * * *
jobs:
build:
runs-on: ubuntu-latest
strategy:
matrix:
python-version: [3.6, 3.7, 3.8, pypy3]
steps:
- uses: actions/checkout@v2
with:
fetch-depth: 0
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v2
with:
python-version: ${{ matrix.python-version }}
- name: Install FFmpeg
run: |
sudo apt install ffmpeg
ffmpeg -version
- name: Install Tox
run: python -m pip install tox tox-gh-actions coverage
- name: Run Tox
run: tox
- name: Convert coverage
run: python -m coverage xml
- name: Upload coverage to Codecov
uses: codecov/codecov-action@v1
with:
fail_ci_if_error: true
|
name: Run unit tests
on:
pull_request:
push:
branches:
- master
tags:
workflow_dispatch:
schedule:
# Run every Sunday at 03:53 UTC
- cron: 53 3 * * 0
jobs:
build:
runs-on: ubuntu-latest
strategy:
matrix:
python-version: [3.6, 3.7, 3.8, pypy3]
steps:
- uses: actions/checkout@v2
with:
fetch-depth: 0
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v2
with:
python-version: ${{ matrix.python-version }}
- name: Install FFmpeg
run: |
sudo apt install ffmpeg
ffmpeg -version
- name: Install Tox
run: python -m pip install tox tox-gh-actions coverage
- name: Run Tox
run: tox
- name: Convert coverage
run: python -m coverage xml
- name: Upload coverage to Codecov
uses: codecov/codecov-action@v1
with:
fail_ci_if_error: true
|
Fix cron syntax to run weekly
|
Fix cron syntax to run weekly
|
YAML
|
mit
|
saimn/sigal,xouillet/sigal,xouillet/sigal,xouillet/sigal,saimn/sigal,saimn/sigal
|
f9e6f2576906f35ca8b1fa5bad41ba01f6a049ab
|
.drone.yml
|
.drone.yml
|
branches: dev
pipeline:
build:
image: fastscore/maker
commands:
- make build
volumes:
- /var/run/docker.sock:/var/run/docker.sock
workspace:
base: /_
path: .
|
branches: master
pipeline:
build:
image: fastscore/maker
commands:
- make build
volumes:
- /var/run/docker.sock:/var/run/docker.sock
workspace:
base: /_
path: .
|
Switch Drone to master branch
|
Switch Drone to master branch
|
YAML
|
apache-2.0
|
opendatagroup/fastscore-cli
|
00eb9d6aadda334ca003b89ed162b7ecbb9cf0b4
|
scripts/ci/pipeline.yml
|
scripts/ci/pipeline.yml
|
---
resources:
- name: garagepi
type: git
source:
uri: [email protected]:robdimsdale/garagepi
branch: develop
private_key: {{private-key}}
- name: slack_alert
type: slack-notification
source:
url: {{slack-url}}
- name: version
type: semver
source:
initial_version: 0.1.0
bucket: garagepi-releases
key: current-version
access_key_id: {{pipeline-bucket-access-key}}
secret_access_key: {{pipeline-bucket-secret-key}}
jobs:
- name: tests
public: false
plan:
- do:
- aggregate:
- get: version
params: {bump: minor}
- get: garagepi
trigger: true
- task: tests
file: garagepi/scripts/ci/tests.yml
- put: version
params: {file: version/number}
on_success:
put: slack_alert
params:
username: concourse
icon_url: http://cl.ly/image/3e1h0H3H2s0P/concourse-logo.png
channel: {{slack-channel}}
text: {{slack-success-text}}
on_failure:
put: slack_alert
params:
username: concourse
icon_url: http://cl.ly/image/3e1h0H3H2s0P/concourse-logo.png
channel: {{slack-channel}}
text: {{slack-failure-text}}
|
---
resources:
- name: garagepi
type: git
source:
uri: [email protected]:robdimsdale/garagepi
branch: develop
private_key: {{private-key}}
- name: slack-alert
type: slack-notification
source:
url: {{slack-url}}
- name: version
type: semver
source:
initial_version: 0.1.0
bucket: garagepi-releases
key: current-version
access_key_id: {{pipeline-bucket-access-key}}
secret_access_key: {{pipeline-bucket-secret-key}}
jobs:
- name: tests
public: false
plan:
- do:
- aggregate:
- get: version
params: {bump: minor}
- get: garagepi
trigger: true
- task: tests
file: garagepi/scripts/ci/tests.yml
- put: version
params: {file: version/number}
on_success:
put: slack-alert
params:
username: concourse
icon_url: http://cl.ly/image/3e1h0H3H2s0P/concourse-logo.png
channel: {{slack-channel}}
text: {{slack-success-text}}
on_failure:
put: slack-alert
params:
username: concourse
icon_url: http://cl.ly/image/3e1h0H3H2s0P/concourse-logo.png
channel: {{slack-channel}}
text: {{slack-failure-text}}
|
Rename slack_alert resource to slack-alert
|
Rename slack_alert resource to slack-alert
|
YAML
|
mit
|
robdimsdale/garagepi,robdimsdale/garagepi,robdimsdale/garagepi
|
3bef17b5a112bf0b576ad88cc162b2cb2fc2a784
|
hieradata/dev01/roles/master.yaml
|
hieradata/dev01/roles/master.yaml
|
---
#profile::firewall::rules:
# '020 allow all from transport network to public ip':
# destination: "%{ipaddress_public1}"
# source: '172.31.34.0/24'
|
---
#profile::firewall::rules:
# '020 allow all from transport network to public ip':
# destination: "%{ipaddress_public1}"
# source: '172.31.34.0/24'
profile::openstack::createnetworks::networks:
public:
name: 'public'
admin_state_up: true
shared: true
tenant_name: 'openstack'
provider_network_type: 'local'
|
Add a network in dev01 location
|
Add a network in dev01 location
|
YAML
|
apache-2.0
|
TorLdre/himlar,tanzr/himlar,norcams/himlar,raykrist/himlar,tanzr/himlar,tanzr/himlar,mikaeld66/himlar,norcams/himlar,eckhart/himlar,raykrist/himlar,tanzr/himlar,raykrist/himlar,norcams/himlar,mikaeld66/himlar,mikaeld66/himlar,eckhart/himlar,TorLdre/himlar,tanzr/himlar,mikaeld66/himlar,raykrist/himlar,TorLdre/himlar,norcams/himlar,eckhart/himlar,TorLdre/himlar,TorLdre/himlar,raykrist/himlar,mikaeld66/himlar,eckhart/himlar,norcams/himlar
|
8c5c63628b420734b9021021c5b55ebc0433ff48
|
.zuul.yaml
|
.zuul.yaml
|
- project:
templates:
- golang-jobs
check:
jobs:
- golang-lint:
voting: false
- kubemon-build-image
gate:
jobs:
- golang-lint:
voting: false
- kubemon-build-image
- job:
name: kubemon-build-image
parent: nuage-build-docker-image
vars:
go_task: make
go_context: nuagekubemon
go_makefile: scripts/Makefile
zuul_work_dir: "{{ ansible_user_dir }}/src/github.com/{{ zuul.project.name }}"
docker_images:
- context: nuagekubemon
dockerfile: Dockerfile
repository: nuage/kubemon
|
- project:
templates:
- golang-jobs
check:
jobs:
- golang-lint
- kubemon-build-image
gate:
jobs:
- golang-lint
- kubemon-build-image
- job:
name: kubemon-build-image
parent: nuage-build-docker-image
vars:
zuul_work_dir: "{{ ansible_user_dir }}/src/github.com/{{ zuul.project.name }}"
container_command: docker
docker_images:
- context: nuagekubemon
dockerfile: Dockerfile
repository: nuage/kubemon
go_task: make
go_context: nuagekubemon
go_makefile: scripts/Makefile
|
Move go vars to docker context
|
Move go vars to docker context
Change-Id: I14840e84eea3f0ffa02d95215db53e919133e2cb
|
YAML
|
bsd-3-clause
|
nuagenetworks/nuage-kubernetes,nuagenetworks/nuage-kubernetes,nuagenetworks/nuage-kubernetes,nuagenetworks/nuage-kubernetes
|
dd1131146f90d6a4d43ad8770fb9f6bc43bada19
|
.zuul.yaml
|
.zuul.yaml
|
- project:
templates:
- check-requirements
- openstack-lower-constraints-jobs
- openstack-python3-victoria-jobs
- release-notes-jobs-python3
|
- project:
templates:
- check-requirements
- openstack-lower-constraints-jobs
- openstack-python3-wallaby-jobs
- release-notes-jobs-python3
|
Add Python3 wallaby unit tests
|
Add Python3 wallaby unit tests
This is an automatically generated patch to ensure unit testing
is in place for all of the tested runtimes for wallaby.
See also the PTI in governance [1].
[1]: https://governance.openstack.org/tc/reference/project-testing-interface.html
Change-Id: I7255c27886820dadef3a1fb12d0f2dfb86216d03
|
YAML
|
apache-2.0
|
openstack/murano-agent,openstack/murano-agent,openstack/murano-agent,openstack/murano-agent
|
02a61d0402b9cceaf9f2a688de7085c57f02fe79
|
.zuul.yaml
|
.zuul.yaml
|
- project:
check:
jobs:
- oslo.versionedobjects-src-grenade-multinode
templates:
- check-requirements
- lib-forward-testing-python3
- openstack-lower-constraints-jobs
- openstack-python3-victoria-jobs
- periodic-stable-jobs
- publish-openstack-docs-pti
- release-notes-jobs-python3
- job:
name: oslo.versionedobjects-src-grenade-multinode
parent: grenade-multinode
voting: false
irrelevant-files:
- ^(test-|)requirements.txt$
- ^setup.cfg$
required-projects:
- opendev.org/openstack/oslo.versionedobjects
|
- project:
check:
jobs:
- oslo.versionedobjects-src-grenade-multinode
templates:
- check-requirements
- lib-forward-testing-python3
- openstack-lower-constraints-jobs
- openstack-python3-wallaby-jobs
- periodic-stable-jobs
- publish-openstack-docs-pti
- release-notes-jobs-python3
- job:
name: oslo.versionedobjects-src-grenade-multinode
parent: grenade-multinode
voting: false
irrelevant-files:
- ^(test-|)requirements.txt$
- ^setup.cfg$
required-projects:
- opendev.org/openstack/oslo.versionedobjects
|
Add Python3 wallaby unit tests
|
Add Python3 wallaby unit tests
This is an automatically generated patch to ensure unit testing
is in place for all of the tested runtimes for wallaby.
See also the PTI in governance [1].
[1]: https://governance.openstack.org/tc/reference/project-testing-interface.html
Change-Id: Icdc9ca232efbda2c0fa685479d4bbb0edca1e575
|
YAML
|
apache-2.0
|
openstack/oslo.versionedobjects
|
b1442e9a0b3f987c9e6a86985e823a130b91c17e
|
.zuul.yaml
|
.zuul.yaml
|
- project:
check:
jobs:
- monasca-tempest-python-mysql:
voting: false
- monasca-tempest-python-postgresql:
voting: false
- monasca-tempest-java-mysql:
voting: false
- monasca-tempest-java-postgresql:
voting: false
|
- project:
check:
jobs:
- monasca-tempest-python-influxdb:
voting: false
- monasca-tempest-java-influxdb:
voting: false
|
Remove PostgreSQL tempest jobs from Zuul
|
Remove PostgreSQL tempest jobs from Zuul
* remove monasca-tempest-*-postgresql jobs
* rename monasca-tempest-*-mysql -> monasca-tempest-*-influxdb jobs
Story: 2001650
Task: 6670
Depends-On: https://review.openstack.org/550795
Change-Id: I8ec6bc2189f21af06ec43d739e024a23eb4c9f95
|
YAML
|
bsd-3-clause
|
stackforge/monasca-statsd,stackforge/monasca-statsd
|