| column | type / range |
| --- | --- |
| hexsha | stringlengths 40–40 |
| size | int64 5–1.04M |
| ext | stringclasses 6 values |
| lang | stringclasses 1 value |
| max_stars_repo_path | stringlengths 3–344 |
| max_stars_repo_name | stringlengths 5–125 |
| max_stars_repo_head_hexsha | stringlengths 40–78 |
| max_stars_repo_licenses | sequencelengths 1–11 |
| max_stars_count | int64 1–368k ⌀ |
| max_stars_repo_stars_event_min_datetime | stringlengths 24–24 ⌀ |
| max_stars_repo_stars_event_max_datetime | stringlengths 24–24 ⌀ |
| max_issues_repo_path | stringlengths 3–344 |
| max_issues_repo_name | stringlengths 5–125 |
| max_issues_repo_head_hexsha | stringlengths 40–78 |
| max_issues_repo_licenses | sequencelengths 1–11 |
| max_issues_count | int64 1–116k ⌀ |
| max_issues_repo_issues_event_min_datetime | stringlengths 24–24 ⌀ |
| max_issues_repo_issues_event_max_datetime | stringlengths 24–24 ⌀ |
| max_forks_repo_path | stringlengths 3–344 |
| max_forks_repo_name | stringlengths 5–125 |
| max_forks_repo_head_hexsha | stringlengths 40–78 |
| max_forks_repo_licenses | sequencelengths 1–11 |
| max_forks_count | int64 1–105k ⌀ |
| max_forks_repo_forks_event_min_datetime | stringlengths 24–24 ⌀ |
| max_forks_repo_forks_event_max_datetime | stringlengths 24–24 ⌀ |
| content | stringlengths 5–1.04M |
| avg_line_length | float64 1.14–851k |
| max_line_length | int64 1–1.03M |
| alphanum_fraction | float64 0–1 |
| lid | stringclasses 191 values |
| lid_prob | float64 0.01–1 |
61df488fd80bb2a329d43feede497020fbd10d5b | 2,841 | md | Markdown | README.md | ryohei-kamiya/2D-GRD | 52b6dea0369f020838238d64bd7e4bf137a70836 | ["Apache-2.0"] | 7 | 2018-02-02T09:45:28.000Z | 2018-04-24T11:15:00.000Z | README.md | kawasaki2013/2D-GRD | 52b6dea0369f020838238d64bd7e4bf137a70836 | ["Apache-2.0"] | null | null | null | README.md | kawasaki2013/2D-GRD | 52b6dea0369f020838238d64bd7e4bf137a70836 | ["Apache-2.0"] | 2 | 2018-12-03T08:57:27.000Z | 2019-07-15T02:16:25.000Z |

# 2D-GRD
## Overview

This is a toy program that performs two-dimensional gesture recognition/prediction using Sony's Neural Network Console / Libraries. It was created as content for [Sony Neural Network Console Study Session #2](https://delta.connpass.com/event/77373/); the presentation slides from that session are available [here](https://www.slideshare.net/k38ryh/neural-network-console2).

This repository contains the following programs (Python 3 scripts):

- A program that records two-dimensional gestures (single-stroke point-sequence data)
  - src/gesture_painter.py
- A program that augments the recorded original gestures with nonlinear geometric transformations
  - src/make_datafiles.py
- Programs for training and evaluating simple gesture recognizers/predictors built with Sony's Neural Network Libraries
  - src/mlp.py # Multilayer perceptron
  - src/lenet.py # LeNet (a modified version that uses BatchNormalization)
  - src/lstm.py # LSTM
  - src/lstm_with_baseshift.py # Builds and evaluates an LSTM-based gesture predictor (shifts the current coordinates to (0,0))
  - src/delta2_lstm_trainer.py # Training program for the gesture predictor (just the training part extracted from lstm_with_baseshift.py)
- Programs that perform gesture recognition/prediction with the trained recognizers/predictors
  - src/delta2_mlp_gesture_recognizer.py # Recognizes gestures with the MLP
  - src/delta2_mlp_with_lstm_gesture_recognizer.py # Predicts gestures with the LSTM and recognizes them with the MLP
  - src/gesture_recognizer.py # Gesture recognition/prediction program that can switch between the MLP, LeNet, and LSTM

## Data Augmentation (Gesture Pattern Generation)

The original data recorded with src/gesture_painter.py is augmented by make_datafiles.py, which applies a homography transformation and a Gaussian-function spatial distortion multiple times. Please check each source file for the exact generation method.
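A rough sketch of that style of augmentation (illustrative only — the real parameters and sampling live in src/make_datafiles.py, which is the source of truth):

```python
import numpy as np

def augment_gesture(points, rng=None):
    """Augment one gesture: points is an (N, 2) array, roughly in [0, 1]^2."""
    rng = rng if rng is not None else np.random.default_rng()

    # 1) Random homography: a small perturbation of the 3x3 identity matrix
    H = np.eye(3) + rng.normal(scale=0.05, size=(3, 3))
    homogeneous = np.hstack([points, np.ones((len(points), 1))]) @ H.T
    points = homogeneous[:, :2] / homogeneous[:, 2:3]

    # 2) Gaussian spatial distortion: displace points by a Gaussian bump
    center = rng.uniform(0.0, 1.0, size=2)
    amplitude = rng.normal(scale=0.03, size=2)   # x/y displacement strength
    sigma = 0.2                                  # bump width
    weights = np.exp(-((points - center) ** 2).sum(axis=1) / (2 * sigma**2))
    return points + weights[:, None] * amplitude
```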
## Creating the Datasets

After generating gesture patterns with src/gesture_painter.py and src/make_datafiles.py, generate each dataset with the following scripts. Please check the contents of each script for details.

- Point-sequence dataset generation script
  - src/make-points-dataset.sh
- Image dataset generation script
  - src/make-image-dataset.sh
- Generation script for point-sequence datasets with a fixed number of points
  - src/make-sampled_points-dataset.sh

## Trained Model Parameter Files and Class Label File

Trained model parameter files and a class label file for checking that the gesture prediction/recognition programs work are provided below.
Note that these are samples intended only for operation checks; their performance has not been optimized.

- Trained model parameter files
  - models/mlp-parameters.h5 (model parameter file for the multilayer perceptron)
  - models/lenet-parameters.h5 (model parameter file for LeNet)
  - models/lstm-with-baseshift-parameters.h5 (model parameter file for the LSTM)
- Class label file
  - labels.txt

## Launcher Scripts

The following launcher scripts (bash shell scripts) spell out the command-line arguments for the scripts above.

- Launcher scripts for training/evaluating the multilayer perceptron
  - src/mlp-train.sh (for training)
  - src/mlp-evaluate.sh (for evaluation)
  - src/mlp-infer.sh (for checking the inference function)
- Launcher scripts for training/evaluating LeNet
  - src/lenet-train.sh (for training)
  - src/lenet-evaluate.sh (for evaluation)
  - src/lenet-infer.sh (for checking the inference function)
- Launcher scripts for training/evaluating the LSTM
  - src/lstm-train.sh (for training)
  - src/lstm-evaluate.sh (for evaluation)
  - src/lstm-infer.sh (for checking the inference function)
  - src/lstm-with-baseshift-train.sh (runs the training process of lstm_with_baseshift.py)
  - src/run-delta2-lstm-trainer.sh (runs delta2_lstm_trainer.py)
- Launcher scripts for gesture recognition with the multilayer perceptron
  - src/run-delta2-mlp-gesture-recognizer.sh
  - src/run-mlp-gesture-recognizer.sh
- Launcher script for gesture recognition with LeNet
  - src/run-lenet-gesture-recognizer.sh
- Launcher scripts that predict the gesture trajectory with the LSTM and recognize it with the multilayer perceptron
  - src/run-delta2-mlp-with-lstm-gesture-recognizer.sh
  - src/run-mlp-with-lstm-gesture-recognizer.sh

## Maintenance

Feedback is welcome. In particular, I would be glad to hear from you if you find any bugs.
| 38.391892 | 246 | 0.787399 | yue_Hant | 0.529739 |

61df6353b58e5260254f27fb0ed2f5081e4573e8 | 1,717 | md | Markdown | docs/deployment/install.md | kevin70/houge | 820434f25241e53bff18d7098b257591ed094fd2 | ["Apache-2.0"] | 4 | 2021-01-27T03:55:05.000Z | 2021-04-14T06:32:21.000Z | docs/deployment/install.md | tanbinh123/houge | 820434f25241e53bff18d7098b257591ed094fd2 | ["Apache-2.0"] | null | null | null | docs/deployment/install.md | tanbinh123/houge | 820434f25241e53bff18d7098b257591ed094fd2 | ["Apache-2.0"] | 3 | 2021-12-09T23:29:09.000Z | 2022-03-14T07:43:28.000Z |

# Houge Installation and Deployment
Docker image builds are published by default, and deploying the Houge services with Docker is recommended. Before reading this document, please make sure the Docker environment is installed: [Get Docker](https://docs.docker.com/get-docker/).

## Prerequisites

Use the following command to check the Docker version information and confirm that Docker is installed.
```
$ docker --version
Docker version 20.10.2, build 2291f61
```
## Deploying the Services with Docker Compose

If Docker Compose is not installed, refer to the [Install Docker Compose](https://docs.docker.com/compose/install/) documentation.

Houge ships a [docker-compose.yml](https://gitee.com/kk70/tethys/blob/main/docker-compose.yml) configuration that helps you deploy the Tethys services quickly.

### Get docker-compose.yml
```
$ curl https://gitee.com/kk70/tethys/raw/main/docker-compose.yml > docker-compose.yml
```
### Start the services
```
docker-compose up -d
```
### View the logs
```
$ docker logs tethys-server-logic
exec java -XX:+ExitOnOutOfMemoryError -cp . -jar /app/app.jar
... (logs omitted)
07:01:07.769 [reactor-tcp-epoll-1] INFO AbstractApplicationIdentifier 117 - 新增 ServerInstance: ServerInstance(id=10133, appName=tethys-logic, hostName=f66eb00b96ef, hostAddress=172.28.0.6, osName=Linux, osVersion=4.19.128-microsoft-standard, osArch=amd64, osUser=root, javaVmName=OpenJDK 64-Bit Server VM, javaVmVersion=11.0.11+9, javaVmVendor=Oracle Corporation, workDir=/app, pid=7, ver=0, createTime=null, checkTime=null)
07:01:07.950 [main] INFO LogicServer 64 - Logic gRPC服务启动成功 [/0.0.0.0:11012]
```
```
$ docker logs tethys-server-rest
exec java -XX:+ExitOnOutOfMemoryError -cp . -jar /app/app.jar
... (logs omitted)
07:01:07.768 [main] INFO RestServer 77 - REST Server 启动完成 - 0.0.0.0:11019
```
```
$ docker logs tethys-server-ws
exec java -XX:+ExitOnOutOfMemoryError -cp . -jar /app/app.jar
... (logs omitted)
07:01:05.520 [main] INFO WsServer 70 - WS服务启动成功 [/0.0.0.0:11010]
```
Use the `docker logs` command to view the Houge service logs. When the log lines above are printed, the Houge services have started successfully.
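Before tailing individual logs, you can also list the state of all containers at once (standard Docker Compose usage; this command is an addition, not part of the original document):

```
$ docker-compose ps
```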
| 27.253968 | 425 | 0.726849 | yue_Hant | 0.671641 |

61e09eb5b21e7a534d0ac7f272cd6942a7d2f8be | 272 | md | Markdown | README.md | wacko1805/Simple-PHP-chat-site | 7a591f9ab21ff772e59fea3e88058f7ed41657bf | ["Apache-2.0"] | 1 | 2020-11-10T05:42:01.000Z | 2020-11-10T05:42:01.000Z | README.md | wacko1805/Simple-PHP-chat-site | 7a591f9ab21ff772e59fea3e88058f7ed41657bf | ["Apache-2.0"] | null | null | null | README.md | wacko1805/Simple-PHP-chat-site | 7a591f9ab21ff772e59fea3e88058f7ed41657bf | ["Apache-2.0"] | 2 | 2021-10-03T21:34:13.000Z | 2022-03-27T01:41:30.000Z |

# Simple-PHP-chat-site
A super simple PHP script that allows you to chat with anyone who is on the site.
It uses a PHP script to append each message to a .txt file. The .txt file is displayed on the HTML page, along with the form.
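A minimal sketch of that idea (illustrative only; the file names and markup here are assumptions, not necessarily the repo's actual code):

```php
<?php
// chat.php - append a submitted message to the log, then show the log and the form
if (!empty($_POST['message'])) {
    // FILE_APPEND adds to the end of chat.txt instead of overwriting it
    file_put_contents('chat.txt', htmlspecialchars($_POST['message']) . "\n", FILE_APPEND);
}
?>
<pre><?php echo @file_get_contents('chat.txt'); ?></pre>
<form method="post">
    <input type="text" name="message">
    <button type="submit">Send</button>
</form>
```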
View demo here:
https://simple-php-chat.herokuapp.com/
| 38.857143 | 111 | 0.746324 | eng_Latn | 0.992507 |

61e0ba6f06d359cefd900c99b15f7d1af3b2ef0f | 1,662 | md | Markdown | catalog/dagashi-kashi/en-US_dagashi-kashi-tv-serie.md | htron-dev/baka-db | cb6e907a5c53113275da271631698cd3b35c9589 | ["MIT"] | 3 | 2021-08-12T20:02:29.000Z | 2021-09-05T05:03:32.000Z | catalog/dagashi-kashi/en-US_dagashi-kashi-tv-serie.md | zzhenryquezz/baka-db | da8f54a87191a53a7fca54b0775b3c00f99d2531 | ["MIT"] | 8 | 2021-07-20T00:44:48.000Z | 2021-09-22T18:44:04.000Z | catalog/dagashi-kashi/en-US_dagashi-kashi-tv-serie.md | zzhenryquezz/baka-db | da8f54a87191a53a7fca54b0775b3c00f99d2531 | ["MIT"] | 2 | 2021-07-19T01:38:25.000Z | 2021-07-29T08:10:29.000Z |

# Dagashi Kashi

- **type**: tv-serie
- **episodes**: 12
- **original-name**: だがしかし
- **start-date**: 2016-01-08
- **end-date**: 2016-01-08
- **opening-song**: "Checkmate!?" by MICHI
- **ending-song**: "Hey! Calorie Queen (Hey! カロリー Queen)" by Ayana Taketatsu
- **rating**: PG-13 - Teens 13 or older
## Tags
- comedy
- shounen
- slice-of-life
## Synopsis
Out in the countryside stands a sweet shop run by the Shikada family for nine generations: Shikada Dagashi, a small business selling traditional Japanese candy. However, despite his father's pleas, Kokonotsu Shikada, an aspiring manga artist, adamantly refuses to inherit the family business.
However, this may start to change with the arrival of the eccentric Hotaru Shidare. Hotaru is in search of Kokonotsu's father, with the goal of bringing him back to work for her family's company, Shidare Corporation, a world famous sweets manufacturer. Although the senior Shikada initially refuses, he states that he will change his mind on one condition: if Hotaru can convince Kokonotsu to take over the family shop. And so begins Hotaru's mission to enlighten the boy on the true joy of delicious and nostalgic dagashi!
[Source: MyAnimeList]
## Links
- [My Anime list](https://myanimelist.net/anime/31636/Dagashi_Kashi)
- [Official Site](http://www.tbs.co.jp/anime/dagashi/1st/)
- [AnimeDB](http://anidb.info/perl-bin/animedb.pl?show=anime&aid=11601)
- [AnimeNewsNetwork](http://www.animenewsnetwork.com/encyclopedia/anime.php?id=17723)
- [Wikipedia](http://en.wikipedia.org/wiki/Dagashi_Kashi)
| 47.485714 | 523 | 0.737064 | eng_Latn | 0.924767 |

61e0fb65f697fab2083d0a8baeec3a5ad68dcd59 | 48 | md | Markdown | README.md | oxr463/oxr463.github.io | 8865688c4fcf63d26162501e220fb5e6004b8a7b | ["Unlicense"] | null | null | null | README.md | oxr463/oxr463.github.io | 8865688c4fcf63d26162501e220fb5e6004b8a7b | ["Unlicense"] | null | null | null | README.md | oxr463/oxr463.github.io | 8865688c4fcf63d26162501e220fb5e6004b8a7b | ["Unlicense"] | null | null | null |

# oxr463.github.io
<https://lramage.gitlab.io>
| 12 | 27 | 0.708333 | zul_Latn | 0.172962 |

61e1124a0605215c4b88ffba09a81608ed8546e1 | 33,562 | md | Markdown | articles/cognitive-services/text-analytics/includes/quickstarts/nodejs-sdk.md | Artaggedon/azure-docs.es-es | 73e6ff211a5d55a2b8293a4dc137c48a63ed1369 | ["CC-BY-4.0", "MIT"] | null | null | null | articles/cognitive-services/text-analytics/includes/quickstarts/nodejs-sdk.md | Artaggedon/azure-docs.es-es | 73e6ff211a5d55a2b8293a4dc137c48a63ed1369 | ["CC-BY-4.0", "MIT"] | null | null | null | articles/cognitive-services/text-analytics/includes/quickstarts/nodejs-sdk.md | Artaggedon/azure-docs.es-es | 73e6ff211a5d55a2b8293a4dc137c48a63ed1369 | ["CC-BY-4.0", "MIT"] | 1 | 2019-04-02T15:42:37.000Z | 2019-04-02T15:42:37.000Z |

---
title: 'Quickstart: Text Analytics client library v3 for Node.js | Microsoft Docs'
description: Get started with the Text Analytics client library v3 for Node.js.
author: aahill
manager: nitinme
ms.service: cognitive-services
ms.subservice: text-analytics
ms.topic: include
ms.date: 04/19/2021
ms.author: aahi
ms.reviewer: sumeh, assafi
ms.custom: devx-track-js
ms.openlocfilehash: 72ca331546d53f85ca82f33ec6a02558d91f1c1e
ms.sourcegitcommit: 4b0e424f5aa8a11daf0eec32456854542a2f5df0
ms.translationtype: HT
ms.contentlocale: es-ES
ms.lasthandoff: 04/20/2021
ms.locfileid: "107765106"
---
<a name="HOLTop"></a>
# <a name="version-31-preview"></a>[Versión 3.1 (versión preliminar)](#tab/version-3-1)
[Documentación de referencia de v3](/javascript/api/overview/azure/ai-text-analytics-readme?preserve-view=true&view=azure-node-preview) | [Código fuente de la biblioteca v3](https://github.com/Azure/azure-sdk-for-js/tree/master/sdk/textanalytics/ai-text-analytics) | [Paquete v3 (NPM)](https://www.npmjs.com/package/@azure/ai-text-analytics) | [Ejemplos de v3](https://github.com/Azure/azure-sdk-for-js/tree/master/sdk/textanalytics/ai-text-analytics/samples)
# <a name="version-30"></a>[Versión 3.0](#tab/version-3)
[Documentación de referencia de v3](/javascript/api/overview/azure/ai-text-analytics-readme) | [Código fuente de la biblioteca v3](https://github.com/Azure/azure-sdk-for-js/tree/master/sdk/textanalytics/ai-text-analytics) | [Paquete v3 (NPM)](https://www.npmjs.com/package/@azure/ai-text-analytics) | [Ejemplos de v3](https://github.com/Azure/azure-sdk-for-js/tree/master/sdk/textanalytics/ai-text-analytics/samples)
---
## <a name="prerequisites"></a>Requisitos previos
* Una suscripción a Azure: [cree una cuenta gratuita](https://azure.microsoft.com/free/cognitive-services)
* La versión actual de [Node.js](https://nodejs.org/).
* Una vez que tenga la suscripción de Azure, <a href="https://ms.portal.azure.com/#create/Microsoft.CognitiveServicesTextAnalytics" title="cree un recurso de Text Analytics" target="_blank">cree un recurso de Text Analytics </a> en Azure Portal para obtener la clave y el punto de conexión. Una vez que se implemente, haga clic en **Ir al recurso**.
* Necesitará la clave y el punto de conexión del recurso que cree para conectar la aplicación a API Text Analytics. En una sección posterior de este mismo inicio rápido pegará la clave y el punto de conexión en el código siguiente.
* Puede usar el plan de tarifa gratis (`F0`) para probar el servicio y actualizarlo más adelante a un plan de pago para producción.
* Para usar la característica Analyze, necesitará un recurso de Text Analytics con un plan de tarifa Estándar (S).
## <a name="setting-up"></a>Instalación
### <a name="create-a-new-nodejs-application"></a>Creación de una aplicación Node.js
En una ventana de la consola (como cmd, PowerShell o Bash), cree un directorio para la aplicación y vaya a él.
```console
mkdir myapp
cd myapp
```
Run the `npm init` command to create a node application with a `package.json` file.
```console
npm init
```
### <a name="install-the-client-library"></a>Instalación de la biblioteca cliente
# <a name="version-31-preview"></a>[Versión 3.1 (versión preliminar)](#tab/version-3-1)
Instale los paquetes NPM `@azure/ai-text-analytics`:
```console
npm install --save @azure/[email protected]
```
> [!TIP]
> Want to view the whole quickstart code file at once? You can find it [on GitHub](https://github.com/Azure-Samples/cognitive-services-quickstart-code/blob/master/javascript/TextAnalytics/text-analytics-v3-client-library.js), which contains the code examples in this quickstart.

# <a name="version-30"></a>[Version 3.0](#tab/version-3)

Install the `@azure/ai-text-analytics` NPM package:
```console
npm install --save @azure/[email protected]
```
> [!TIP]
> Want to view the whole quickstart code file at once? You can find it [on GitHub](https://github.com/Azure-Samples/cognitive-services-quickstart-code/blob/master/javascript/TextAnalytics/text-analytics-v3-client-library.js), which contains the code examples in this quickstart.
---
Your app's `package.json` file will be updated with the dependencies.

Create a file named `index.js` and add the following:
# <a name="version-31-preview"></a>[Versión 3.1 (versión preliminar)](#tab/version-3-1)
```javascript
"use strict";
const { TextAnalyticsClient, AzureKeyCredential } = require("@azure/ai-text-analytics");
```
# <a name="version-30"></a>[Versión 3.0](#tab/version-3)
```javascript
"use strict";
const { TextAnalyticsClient, AzureKeyCredential } = require("@azure/ai-text-analytics");
```
---
Create variables for your resource's Azure endpoint and key.
[!INCLUDE [text-analytics-find-resource-information](../find-azure-resource-info.md)]
```javascript
const key = '<paste-your-text-analytics-key-here>';
const endpoint = '<paste-your-text-analytics-endpoint-here>';
```
## <a name="object-model"></a>Modelo de objetos
El cliente de Text Analytics es un objeto `TextAnalyticsClient` que se autentica en Azure mediante la clave. El cliente proporciona varios métodos para analizar texto, como una sola cadena o un lote.
El texto se envía a la API como una lista de `documents`, que son objetos `dictionary` que contienen una combinación de atributos `id`, `text` y `language`, según el método utilizado. El atributo `text` almacena el texto que se va a analizar en el origen `language` y `id` puede ser cualquier valor.
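For example, a batch of documents for the methods below might look like this (an illustrative sketch only; the values are made up):

```javascript
const documents = [
    { id: "1", language: "en", text: "I had the best day of my life." },
    { id: "2", language: "es", text: "Este documento está escrito en español." }
];
```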
The response object is a list containing the analysis information for each document.
## <a name="code-examples"></a>Ejemplos de código
* [Autenticación de cliente](#client-authentication)
* [Análisis de sentimiento](#sentiment-analysis)
* [Minería de opiniones](#opinion-mining)
* [Detección de idioma](#language-detection)
* [Reconocimiento de entidades con nombre](#named-entity-recognition-ner)
* [Vinculación de entidad](#entity-linking)
* Información de identificación personal
* [Extracción de frases clave](#key-phrase-extraction)
## <a name="client-authentication"></a>Autenticación de clientes
# <a name="version-31-preview"></a>[Versión 3.1 (versión preliminar)](#tab/version-3-1)
Cree un objeto `TextAnalyticsClient` con el punto de conexión y la clave como parámetros.
```javascript
const textAnalyticsClient = new TextAnalyticsClient(endpoint, new AzureKeyCredential(key));
```
# <a name="version-30"></a>[Versión 3.0](#tab/version-3)
Cree un objeto `TextAnalyticsClient` con el punto de conexión y la clave como parámetros.
```javascript
const textAnalyticsClient = new TextAnalyticsClient(endpoint, new AzureKeyCredential(key));
```
---
## <a name="sentiment-analysis"></a>análisis de opiniones
# <a name="version-31-preview"></a>[Versión 3.1 (versión preliminar)](#tab/version-3-1)
Cree una matriz de cadenas que contenga el documento que quiere analizar. Llame al método `analyzeSentiment()` del cliente y obtenga el objeto `SentimentBatchResult` devuelto. Recorra en iteración la lista de resultados e imprima el identificador de cada documento y la opinión de nivel de documento con puntuaciones de confianza. El resultado contiene, para cada documento, un sentimiento de nivel de oración junto con las puntuaciones de desplazamientos, longitud y confianza.
```javascript
async function sentimentAnalysis(client){
const sentimentInput = [
"I had the best day of my life. I wish you were there with me."
];
const sentimentResult = await client.analyzeSentiment(sentimentInput);
sentimentResult.forEach(document => {
console.log(`ID: ${document.id}`);
console.log(`\tDocument Sentiment: ${document.sentiment}`);
console.log(`\tDocument Scores:`);
console.log(`\t\tPositive: ${document.confidenceScores.positive.toFixed(2)} \tNegative: ${document.confidenceScores.negative.toFixed(2)} \tNeutral: ${document.confidenceScores.neutral.toFixed(2)}`);
console.log(`\tSentences Sentiment(${document.sentences.length}):`);
document.sentences.forEach(sentence => {
console.log(`\t\tSentence sentiment: ${sentence.sentiment}`)
console.log(`\t\tSentences Scores:`);
console.log(`\t\tPositive: ${sentence.confidenceScores.positive.toFixed(2)} \tNegative: ${sentence.confidenceScores.negative.toFixed(2)} \tNeutral: ${sentence.confidenceScores.neutral.toFixed(2)}`);
});
});
}
sentimentAnalysis(textAnalyticsClient)
```
Run your code with `node index.js` in your console window.
### <a name="output"></a>Output
```console
ID: 0
Document Sentiment: positive
Document Scores:
Positive: 1.00 Negative: 0.00 Neutral: 0.00
Sentences Sentiment(2):
Sentence sentiment: positive
Sentences Scores:
Positive: 1.00 Negative: 0.00 Neutral: 0.00
Sentence sentiment: neutral
Sentences Scores:
Positive: 0.21 Negative: 0.02 Neutral: 0.77
```
### <a name="opinion-mining"></a>Minería de opiniones
Para realizar el análisis de sentimiento con la minería de opiniones, cree una matriz de cadenas que contenga el documento que quiere analizar. Llame al método `analyzeSentiment()` del cliente con la marca de opción de adición `includeOpinionMining: true` y obtenga el objeto `SentimentBatchResult` devuelto. Recorra en iteración la lista de resultados e imprima el identificador de cada documento y la opinión de nivel de documento con puntuaciones de confianza. Para cada documento, el resultado contiene la opinión de nivel de oración, así como el aspecto y el sentimiento de nivel de opinión.
```javascript
async function sentimentAnalysisWithOpinionMining(client){
const sentimentInput = [
{
text: "The food and service were unacceptable, but the concierge were nice",
id: "0",
language: "en"
}
];
const results = await client.analyzeSentiment(sentimentInput, { includeOpinionMining: true });
for (let i = 0; i < results.length; i++) {
const result = results[i];
console.log(`- Document ${result.id}`);
if (!result.error) {
console.log(`\tDocument text: ${sentimentInput[i].text}`);
console.log(`\tOverall Sentiment: ${result.sentiment}`);
console.log("\tSentiment confidence scores:", result.confidenceScores);
console.log("\tSentences");
for (const { sentiment, confidenceScores, opinions } of result.sentences) {
console.log(`\t- Sentence sentiment: ${sentiment}`);
console.log("\t Confidence scores:", confidenceScores);
console.log("\t Mined opinions");
for (const { target, assessments } of opinions) {
console.log(`\t\t- Target text: ${target.text}`);
console.log(`\t\t Target sentiment: ${target.sentiment}`);
console.log("\t\t Target confidence scores:", target.confidenceScores);
console.log("\t\t Target assessments");
for (const { text, sentiment } of assessments) {
console.log(`\t\t\t- Text: ${text}`);
console.log(`\t\t\t Sentiment: ${sentiment}`);
}
}
}
} else {
console.error(`\tError: ${result.error}`);
}
}
}
sentimentAnalysisWithOpinionMining(textAnalyticsClient)
```
Run your code with `node index.js` in your console window.
### <a name="output"></a>Output
```console
- Document 0
Document text: The food and service were unacceptable, but the concierge were nice
Overall Sentiment: positive
Sentiment confidence scores: { positive: 0.84, neutral: 0, negative: 0.16 }
Sentences
- Sentence sentiment: positive
Confidence scores: { positive: 0.84, neutral: 0, negative: 0.16 }
Mined opinions
- Target text: food
Target sentiment: negative
Target confidence scores: { positive: 0.01, negative: 0.99 }
Target assessments
- Text: unacceptable
Sentiment: negative
- Target text: service
Target sentiment: negative
Target confidence scores: { positive: 0.01, negative: 0.99 }
Target assessments
- Text: unacceptable
Sentiment: negative
- Target text: concierge
Target sentiment: positive
Target confidence scores: { positive: 1, negative: 0 }
Target assessments
- Text: nice
Sentiment: positive
```
# <a name="version-30"></a>[Versión 3.0](#tab/version-3)
Cree una matriz de cadenas que contenga el documento que quiere analizar. Llame al método `analyzeSentiment()` del cliente y obtenga el objeto `SentimentBatchResult` devuelto. Recorra en iteración la lista de resultados e imprima el identificador de cada documento y la opinión de nivel de documento con puntuaciones de confianza. El resultado contiene, para cada documento, un sentimiento de nivel de oración junto con las puntuaciones de desplazamientos, longitud y confianza.
```javascript
async function sentimentAnalysis(client){
const sentimentInput = [
"I had the best day of my life. I wish you were there with me."
];
const sentimentResult = await client.analyzeSentiment(sentimentInput);
sentimentResult.forEach(document => {
console.log(`ID: ${document.id}`);
console.log(`\tDocument Sentiment: ${document.sentiment}`);
console.log(`\tDocument Scores:`);
console.log(`\t\tPositive: ${document.confidenceScores.positive.toFixed(2)} \tNegative: ${document.confidenceScores.negative.toFixed(2)} \tNeutral: ${document.confidenceScores.neutral.toFixed(2)}`);
console.log(`\tSentences Sentiment(${document.sentences.length}):`);
document.sentences.forEach(sentence => {
console.log(`\t\tSentence sentiment: ${sentence.sentiment}`)
console.log(`\t\tSentences Scores:`);
console.log(`\t\tPositive: ${sentence.confidenceScores.positive.toFixed(2)} \tNegative: ${sentence.confidenceScores.negative.toFixed(2)} \tNeutral: ${sentence.confidenceScores.neutral.toFixed(2)}`);
});
});
}
sentimentAnalysis(textAnalyticsClient)
```
Run your code with `node index.js` in your console window.
### <a name="output"></a>Output
```console
ID: 0
Document Sentiment: positive
Document Scores:
Positive: 1.00 Negative: 0.00 Neutral: 0.00
Sentences Sentiment(2):
Sentence sentiment: positive
Sentences Scores:
Positive: 1.00 Negative: 0.00 Neutral: 0.00
Sentence sentiment: neutral
Sentences Scores:
Positive: 0.21 Negative: 0.02 Neutral: 0.77
```
---
## <a name="language-detection"></a>Detección de idiomas
# <a name="version-31-preview"></a>[Versión 3.1 (versión preliminar)](#tab/version-3-1)
Cree una matriz de cadenas que contenga el documento que quiere analizar. Llame al método `detectLanguage()` del cliente y obtenga el objeto `DetectLanguageResultCollection` devuelto. Luego, recorra en iteración los resultados e imprima el identificador de cada documento y el idioma principal respectivo.
```javascript
async function languageDetection(client) {
const languageInputArray = [
"Ce document est rédigé en Français."
];
const languageResult = await client.detectLanguage(languageInputArray);
languageResult.forEach(document => {
console.log(`ID: ${document.id}`);
console.log(`\tPrimary Language ${document.primaryLanguage.name}`)
});
}
languageDetection(textAnalyticsClient);
```
Run your code with `node index.js` in your console window.
### <a name="output"></a>Output
```console
ID: 0
Primary Language French
```
# <a name="version-30"></a>[Versión 3.0](#tab/version-3)
Cree una matriz de cadenas que contenga el documento que quiere analizar. Llame al método `detectLanguage()` del cliente y obtenga el objeto `DetectLanguageResultCollection` devuelto. Luego, recorra en iteración los resultados e imprima el identificador de cada documento y el idioma principal respectivo.
```javascript
async function languageDetection(client) {
const languageInputArray = [
"Ce document est rédigé en Français."
];
const languageResult = await client.detectLanguage(languageInputArray);
languageResult.forEach(document => {
console.log(`ID: ${document.id}`);
console.log(`\tPrimary Language ${document.primaryLanguage.name}`)
});
}
languageDetection(textAnalyticsClient);
```
Run your code with `node index.js` in your console window.
### <a name="output"></a>Output
```console
ID: 0
Primary Language French
```
---
## <a name="named-entity-recognition-ner"></a>Reconocimiento de entidades con nombre (NER)
# <a name="version-31-preview"></a>[Versión 3.1 (versión preliminar)](#tab/version-3-1)
> [!NOTE]
> En la versión `3.1`:
> * La vinculación de entidad es una solicitud independiente distinta a la de NER.
Cree una matriz de cadenas que contenga el documento que quiere analizar. Llame al método `recognizeEntities()` del cliente y obtenga el objeto `RecognizeEntitiesResult`. Recorra en iteración la lista de resultados e imprima el nombre, el tipo, el subtipo, el desplazamiento, la longitud y la puntuación de la entidad.
```javascript
async function entityRecognition(client){
const entityInputs = [
"Microsoft was founded by Bill Gates and Paul Allen on April 4, 1975, to develop and sell BASIC interpreters for the Altair 8800",
"La sede principal de Microsoft se encuentra en la ciudad de Redmond, a 21 kilómetros de Seattle."
];
const entityResults = await client.recognizeEntities(entityInputs);
entityResults.forEach(document => {
console.log(`Document ID: ${document.id}`);
document.entities.forEach(entity => {
console.log(`\tName: ${entity.text} \tCategory: ${entity.category} \tSubcategory: ${entity.subCategory ? entity.subCategory : "N/A"}`);
console.log(`\tScore: ${entity.confidenceScore}`);
});
});
}
entityRecognition(textAnalyticsClient);
```
Run your code with `node index.js` in your console window.
### <a name="output"></a>Output
```console
Document ID: 0
Name: Microsoft Category: Organization Subcategory: N/A
Score: 0.29
Name: Bill Gates Category: Person Subcategory: N/A
Score: 0.78
Name: Paul Allen Category: Person Subcategory: N/A
Score: 0.82
Name: April 4, 1975 Category: DateTime Subcategory: Date
Score: 0.8
Name: 8800 Category: Quantity Subcategory: Number
Score: 0.8
Document ID: 1
Name: 21 Category: Quantity Subcategory: Number
Score: 0.8
Name: Seattle Category: Location Subcategory: GPE
Score: 0.25
```
### <a name="entity-linking"></a>Entity Linking
Create an array of strings containing the document you want to analyze. Call the client's `recognizeLinkedEntities()` method and get the `RecognizeLinkedEntitiesResult` object. Iterate through the list of results, and print the entity name, ID, data source, URL, and matches. Every object in the `matches` array will contain the offset, length, and score for that match.
```javascript
async function linkedEntityRecognition(client){
const linkedEntityInput = [
"Microsoft was founded by Bill Gates and Paul Allen on April 4, 1975, to develop and sell BASIC interpreters for the Altair 8800. During his career at Microsoft, Gates held the positions of chairman, chief executive officer, president and chief software architect, while also being the largest individual shareholder until May 2014."
];
const entityResults = await client.recognizeLinkedEntities(linkedEntityInput);
entityResults.forEach(document => {
console.log(`Document ID: ${document.id}`);
document.entities.forEach(entity => {
console.log(`\tName: ${entity.name} \tID: ${entity.dataSourceEntityId} \tURL: ${entity.url} \tData Source: ${entity.dataSource}`);
console.log(`\tMatches:`)
entity.matches.forEach(match => {
console.log(`\t\tText: ${match.text} \tScore: ${match.confidenceScore.toFixed(2)}`);
})
});
});
}
linkedEntityRecognition(textAnalyticsClient);
```
Run your code with `node index.js` in your console window.
### <a name="output"></a>Output
```console
Document ID: 0
Name: Altair 8800 ID: Altair 8800 URL: https://en.wikipedia.org/wiki/Altair_8800 Data Source: Wikipedia
Matches:
Text: Altair 8800 Score: 0.88
Name: Bill Gates ID: Bill Gates URL: https://en.wikipedia.org/wiki/Bill_Gates Data Source: Wikipedia
Matches:
Text: Bill Gates Score: 0.63
Text: Gates Score: 0.63
Name: Paul Allen ID: Paul Allen URL: https://en.wikipedia.org/wiki/Paul_Allen Data Source: Wikipedia
Matches:
Text: Paul Allen Score: 0.60
Name: Microsoft ID: Microsoft URL: https://en.wikipedia.org/wiki/Microsoft Data Source: Wikipedia
Matches:
Text: Microsoft Score: 0.55
Text: Microsoft Score: 0.55
Name: April 4 ID: April 4 URL: https://en.wikipedia.org/wiki/April_4 Data Source: Wikipedia
Matches:
Text: April 4 Score: 0.32
Name: BASIC ID: BASIC URL: https://en.wikipedia.org/wiki/BASIC Data Source: Wikipedia
Matches:
Text: BASIC Score: 0.33
```
### <a name="personally-identifying-information-pii-recognition"></a>Reconocimiento de información de identificación personal (DCP)
Cree una matriz de cadenas que contenga el documento que quiere analizar. Llame al método `recognizePiiEntities()` del cliente y obtenga el objeto `RecognizePIIEntitiesResult`. Recorra en iteración la lista de resultados e imprima el nombre, el tipo y la puntuación de la entidad.
```javascript
async function piiRecognition(client) {
const documents = [
"The employee's phone number is (555) 555-5555."
];
const results = await client.recognizePiiEntities(documents, "en");
for (const result of results) {
if (result.error === undefined) {
console.log("Redacted Text: ", result.redactedText);
console.log(" -- Recognized PII entities for input", result.id, "--");
for (const entity of result.entities) {
console.log(entity.text, ":", entity.category, "(Score:", entity.confidenceScore, ")");
}
} else {
console.error("Encountered an error:", result.error);
}
}
}
piiRecognition(textAnalyticsClient)
```
Run your code with `node index.js` in your console window.
### <a name="output"></a>Output
```console
Redacted Text: The employee's phone number is **************.
-- Recognized PII entities for input 0 --
(555) 555-5555 : Phone Number (Score: 0.8 )
```
# <a name="version-30"></a>[Versión 3.0](#tab/version-3)
> [!NOTE]
> En la versión `3.0`:
> * La vinculación de entidad es una solicitud independiente distinta a la de NER.
Cree una matriz de cadenas que contenga el documento que quiere analizar. Llame al método `recognizeEntities()` del cliente y obtenga el objeto `RecognizeEntitiesResult`. Recorra en iteración la lista de resultados e imprima el nombre, el tipo, el subtipo, el desplazamiento, la longitud y la puntuación de la entidad.
```javascript
async function entityRecognition(client){
const entityInputs = [
"Microsoft was founded by Bill Gates and Paul Allen on April 4, 1975, to develop and sell BASIC interpreters for the Altair 8800",
"La sede principal de Microsoft se encuentra en la ciudad de Redmond, a 21 kilómetros de Seattle."
];
const entityResults = await client.recognizeEntities(entityInputs);
entityResults.forEach(document => {
console.log(`Document ID: ${document.id}`);
document.entities.forEach(entity => {
console.log(`\tName: ${entity.text} \tCategory: ${entity.category} \tSubcategory: ${entity.subCategory ? entity.subCategory : "N/A"}`);
console.log(`\tScore: ${entity.confidenceScore}`);
});
});
}
entityRecognition(textAnalyticsClient);
```
Run your code with `node index.js` in your console window.
### <a name="output"></a>Output
```console
Document ID: 0
Name: Microsoft Category: Organization Subcategory: N/A
Score: 0.29
Name: Bill Gates Category: Person Subcategory: N/A
Score: 0.78
Name: Paul Allen Category: Person Subcategory: N/A
Score: 0.82
Name: April 4, 1975 Category: DateTime Subcategory: Date
Score: 0.8
Name: 8800 Category: Quantity Subcategory: Number
Score: 0.8
Document ID: 1
Name: 21 Category: Quantity Subcategory: Number
Score: 0.8
Name: Seattle Category: Location Subcategory: GPE
Score: 0.25
```
### <a name="entity-linking"></a>Entity Linking
Create an array of strings containing the document you want to analyze. Call the client's `recognizeLinkedEntities()` method and get the `RecognizeLinkedEntitiesResult` object. Iterate through the list of results, and print the entity name, ID, data source, URL, and matches. Every object in the `matches` array will contain the offset, length, and score for that match.
```javascript
async function linkedEntityRecognition(client){
const linkedEntityInput = [
"Microsoft was founded by Bill Gates and Paul Allen on April 4, 1975, to develop and sell BASIC interpreters for the Altair 8800. During his career at Microsoft, Gates held the positions of chairman, chief executive officer, president and chief software architect, while also being the largest individual shareholder until May 2014."
];
const entityResults = await client.recognizeLinkedEntities(linkedEntityInput);
entityResults.forEach(document => {
console.log(`Document ID: ${document.id}`);
document.entities.forEach(entity => {
console.log(`\tName: ${entity.name} \tID: ${entity.dataSourceEntityId} \tURL: ${entity.url} \tData Source: ${entity.dataSource}`);
console.log(`\tMatches:`)
entity.matches.forEach(match => {
console.log(`\t\tText: ${match.text} \tScore: ${match.confidenceScore.toFixed(2)}`);
})
});
});
}
linkedEntityRecognition(textAnalyticsClient);
```
Run your code with `node index.js` in your console window.
### <a name="output"></a>Output
```console
Document ID: 0
Name: Altair 8800 ID: Altair 8800 URL: https://en.wikipedia.org/wiki/Altair_8800 Data Source: Wikipedia
Matches:
Text: Altair 8800 Score: 0.88
Name: Bill Gates ID: Bill Gates URL: https://en.wikipedia.org/wiki/Bill_Gates Data Source: Wikipedia
Matches:
Text: Bill Gates Score: 0.63
Text: Gates Score: 0.63
Name: Paul Allen ID: Paul Allen URL: https://en.wikipedia.org/wiki/Paul_Allen Data Source: Wikipedia
Matches:
Text: Paul Allen Score: 0.60
Name: Microsoft ID: Microsoft URL: https://en.wikipedia.org/wiki/Microsoft Data Source: Wikipedia
Matches:
Text: Microsoft Score: 0.55
Text: Microsoft Score: 0.55
Name: April 4 ID: April 4 URL: https://en.wikipedia.org/wiki/April_4 Data Source: Wikipedia
Matches:
Text: April 4 Score: 0.32
Name: BASIC ID: BASIC URL: https://en.wikipedia.org/wiki/BASIC Data Source: Wikipedia
Matches:
Text: BASIC Score: 0.33
```
---
## <a name="key-phrase-extraction"></a>Extracción de la frase clave
# <a name="version-31-preview"></a>[Versión 3.1 (versión preliminar)](#tab/version-3-1)
Cree una matriz de cadenas que contenga el documento que quiere analizar. Llame al método `extractKeyPhrases()` del cliente y obtenga el objeto `ExtractKeyPhrasesResult` devuelto. Itere los resultados e imprima el identificador de cada documento y todas las frases clave detectadas.
```javascript
async function keyPhraseExtraction(client){
const keyPhrasesInput = [
"My cat might need to see a veterinarian.",
];
const keyPhraseResult = await client.extractKeyPhrases(keyPhrasesInput);
keyPhraseResult.forEach(document => {
console.log(`ID: ${document.id}`);
console.log(`\tDocument Key Phrases: ${document.keyPhrases}`);
});
}
keyPhraseExtraction(textAnalyticsClient);
```
Run your code with `node index.js` in your console window.
### <a name="output"></a>Output
```console
ID: 0
Document Key Phrases: cat,veterinarian
```
# <a name="version-30"></a>[Versión 3.0](#tab/version-3)
Cree una matriz de cadenas que contenga el documento que quiere analizar. Llame al método `extractKeyPhrases()` del cliente y obtenga el objeto `ExtractKeyPhrasesResult` devuelto. Itere los resultados e imprima el identificador de cada documento y todas las frases clave detectadas.
```javascript
async function keyPhraseExtraction(client){
const keyPhrasesInput = [
"My cat might need to see a veterinarian.",
];
const keyPhraseResult = await client.extractKeyPhrases(keyPhrasesInput);
keyPhraseResult.forEach(document => {
console.log(`ID: ${document.id}`);
console.log(`\tDocument Key Phrases: ${document.keyPhrases}`);
});
}
keyPhraseExtraction(textAnalyticsClient);
```
Run your code with `node index.js` in your console window.
### <a name="output"></a>Output
```console
ID: 0
Document Key Phrases: cat,veterinarian
```
---
## <a name="use-the-api-asynchronously-with-the-analyze-operation"></a>Uso de la API de forma asincrónica con la operación Analyze
# <a name="version-31-preview"></a>[Versión 3.1 (versión preliminar)](#tab/version-3-1)
[!INCLUDE [Analyze Batch Action pricing](../analyze-operation-pricing-caution.md)]
Create a new function called `analyze_example()`, which calls the `beginAnalyze()` function. The result will be a long-running operation that is polled for results.
```javascript
async function analyze_example(client) {
const documents = [
"Microsoft was founded by Bill Gates and Paul Allen.",
];
const actions = {
recognizeEntitiesActions: [{ modelVersion: "latest" }],
};
const poller = await client.beginAnalyzeBatchActions(documents, actions, "en");
console.log(
`The analyze batch actions operation was created on ${poller.getOperationState().createdOn}`
);
console.log(
`The analyze batch actions operation results will expire on ${
poller.getOperationState().expiresOn
}`
);
const resultPages = await poller.pollUntilDone();
for await (const page of resultPages) {
const entitiesAction = page.recognizeEntitiesResults[0];
if (!entitiesAction.error) {
for (const doc of entitiesAction.results) {
console.log(`- Document ${doc.id}`);
if (!doc.error) {
console.log("\tEntities:");
for (const entity of doc.entities) {
console.log(`\t- Entity ${entity.text} of type ${entity.category}`);
}
} else {
console.error("\tError:", doc.error);
}
}
}
}
}
analyze_example(textAnalyticsClient);
```
### <a name="output"></a>Output
```console
The analyze batch actions operation was created on Fri Mar 12 2021 09:53:49 GMT-0800 (Pacific Standard Time)
The analyze batch actions operation results will expire on Sat Mar 13 2021 09:53:49 GMT-0800 (Pacific Standard Time)
- Document 0
Entities:
- Entity Microsoft of type Organization
- Entity Bill Gates of type Person
- Entity Paul Allen of type Person
```
You can also use the Analyze operation to detect PII, recognize linked entities, and extract key phrases. See the Analyze samples for [JavaScript](https://github.com/Azure/azure-sdk-for-js/tree/master/sdk/textanalytics/ai-text-analytics/samples/v5/javascript) and [TypeScript](https://github.com/Azure/azure-sdk-for-js/tree/master/sdk/textanalytics/ai-text-analytics/samples/v5/typescript/src) on GitHub.
# <a name="version-30"></a>[Versión 3.0](#tab/version-3)
Esta característica no está disponible en la versión 3.0.
---
## <a name="run-the-application"></a>Ejecución de la aplicación
Ejecute la aplicación con el comando `node` en el archivo de inicio rápido.
```console
node index.js
```
| 42.75414 | 596 | 0.685209 | spa_Latn | 0.410952 |

61e130752ce2f5dd8664344f270a1a998dda0dfc | 3,656 | md | Markdown | README.md | hans362/lyrio-ui | c8997656efdf216071f9e151b88df7c6405c1f25 | ["MIT"] | 6 | 2021-12-20T05:17:57.000Z | 2022-03-02T05:16:50.000Z | README.md | renbaoshuo/lyrio-ui | a759426ff21a48d6078ac33d04273a010b1af266 | ["MIT"] | 2 | 2021-11-23T16:34:04.000Z | 2021-11-27T15:12:52.000Z | README.md | hans362/lyrio-ui | c8997656efdf216071f9e151b88df7c6405c1f25 | ["MIT"] | 5 | 2021-11-26T06:03:56.000Z | 2021-12-05T00:58:46.000Z |

# Lyrio UI
[](https://github.com/lyrio-dev/ui/actions?query=workflow%3ABuild)
[](https://david-dm.org/lyrio-dev/ui)
[](http://commitizen.github.io/cz-cli/)
[](https://github.com/prettier/prettier)
[](LICENSE)
[](https://www.jsdelivr.com/package/npm/@lyrio/ui)
The web frontend of Lyrio.
# Development
Clone this git repo and install dependencies:
```bash
$ git clone [email protected]:lyrio-dev/ui.git lyrio-ui
$ cd lyrio-ui
$ yarn
```
By default this app listens on `0.0.0.0:3000`; you can change this with the environment variables `PORT` and `HOST`. You can use nginx as a reverse proxy to access the app with a domain name like `lyrio-ui.test`.
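For example (assuming the variables are passed as plain environment variables when starting the app):

```bash
$ PORT=8080 HOST=127.0.0.1 yarn start
```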
Start the [lyrio](https://github.com/lyrio-dev/lyrio) API server. For example, if the API server is accessible on `http://lyrio.test`, the API endpoint is actually `http://lyrio.test` (without `/api`).

* If the API endpoint is not the same as lyrio-ui's root URL, you should replace the `__api_endpoint__` string in lyrio-ui's HTML (e.g. with Nginx's `ngx_http_sub_module` module) with the API endpoint (in the form of a JS expression, e.g. `"http://lyrio.test"`).
* To change the initial title of the page, replace `__default_title__`.
* To load compiled frontend resources from another host, replace `__public_path__`.
* To change the favicon, replace `__favicon__`.
All these replacements work in development or production environment.
Here's a Nginx development configuration file for reference (don't forget to add the `.test` domains to your `hosts` or local DNS server):
```nginx
map $http_upgrade $connection_upgrade {
default upgrade;
'' close;
}
server {
server_name lyrio-ui.test;
listen 80;
location / {
proxy_read_timeout 300s;
proxy_send_timeout 300s;
proxy_http_version 1.1;
proxy_set_header Upgrade $http_upgrade;
proxy_set_header Connection $connection_upgrade;
proxy_set_header Accept-Encoding "";
sub_filter '__default_title__' '"Default Title"';
sub_filter '__api_endpoint__' '"http://lyrio.test"';
sub_filter_once on;
proxy_pass http://127.0.0.1:3000;
}
}
server {
server_name lyrio.test;
listen 80;
location / {
proxy_read_timeout 300s;
proxy_send_timeout 300s;
proxy_http_version 1.1;
proxy_set_header Upgrade $http_upgrade;
proxy_set_header Connection $connection_upgrade;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_pass http://127.0.0.1:2002;
}
}
```
If you run the API server and the frontend app server on different [origins](https://developer.mozilla.org/en-US/docs/Web/HTTP/CORS) (as I do), you should enable `crossOrigin` in this server's config and configure the API server to whitelist this server's origin. For example, if you access this server on `http://lyrio-ui.test`:
```yaml
security:
crossOrigin:
enabled: true
whiteList:
- http://lyrio-ui.test
```
Start the development server:
```bash
$ yarn start
```
Wait for Vite to finish compilation and the development server to start, then open `http://lyrio-ui.test`.
| 38.893617 | 324 | 0.722101 | eng_Latn | 0.707536 |

61e16864115ebc28a4054dc5e25d82c9f2de26fc | 1,618 | md | Markdown | docs/ado/reference/ado-md-api/position-object-ado-md.md | strikersree/sql-docs | 9ece10c2970a4f0812647149d3de2c6b75713e14 | ["CC-BY-4.0", "MIT"] | 2 | 2019-02-08T05:59:39.000Z | 2019-02-12T03:27:49.000Z | docs/ado/reference/ado-md-api/position-object-ado-md.md | strikersree/sql-docs | 9ece10c2970a4f0812647149d3de2c6b75713e14 | ["CC-BY-4.0", "MIT"] | null | null | null | docs/ado/reference/ado-md-api/position-object-ado-md.md | strikersree/sql-docs | 9ece10c2970a4f0812647149d3de2c6b75713e14 | ["CC-BY-4.0", "MIT"] | 1 | 2019-04-02T15:42:37.000Z | 2019-04-02T15:42:37.000Z |

---
title: "Position Object (ADO MD) | Microsoft Docs"
ms.prod: sql
ms.prod_service: connectivity
ms.technology: connectivity
ms.custom: ""
ms.date: "01/19/2017"
ms.reviewer: ""
ms.topic: conceptual
apitype: "COM"
f1_keywords:
- "Position"
helpviewer_keywords:
- "Position object [ADO MD]"
ms.assetid: 91eab784-3ce9-41d6-a840-9b0939ca0608
author: MightyPen
ms.author: genemi
manager: craigg
---
# Position Object (ADO MD)
Represents a set of one or more members of different dimensions that defines a point along an axis.
## Remarks
With the properties and collections of a **Position** object you can do the following:
- Use the **Ordinal** property to return the ordinal position of the **Position** along the [Axis](../../../ado/reference/ado-md-api/axis-object-ado-md.md).
- Use the [Members](../../../ado/reference/ado-md-api/members-collection-ado-md.md) collection to return the members that make up the position along the **Axis**.
This section contains the following topic.
- [Properties, Methods, and Events](../../../ado/reference/ado-md-api/position-object-properties-methods-and-events.md)
## See Also
[Axis Example (VBScript)](../../../ado/reference/ado-md-api/axis-example-vbscript.md)
[Axis Object (ADO MD)](../../../ado/reference/ado-md-api/axis-object-ado-md.md)
[Cell Object (ADO MD)](../../../ado/reference/ado-md-api/cell-object-ado-md.md)
[Members Collection (ADO MD)](../../../ado/reference/ado-md-api/members-collection-ado-md.md)
[Positions Collection (ADO MD)](../../../ado/reference/ado-md-api/positions-collection-ado-md.md)
| 40.45 | 166 | 0.697157 | eng_Latn | 0.607787 |

61e16995017afafbac267c00bdffe381b4b7a5c1 | 199 | md | Markdown | README.md | dodoyiaaron/Python-StudyingNotes | e32af7cdf461a4a0b2090c968051cb5bb94349c8 | ["MIT"] | null | null | null | README.md | dodoyiaaron/Python-StudyingNotes | e32af7cdf461a4a0b2090c968051cb5bb94349c8 | ["MIT"] | null | null | null | README.md | dodoyiaaron/Python-StudyingNotes | e32af7cdf461a4a0b2090c968051cb5bb94349c8 | ["MIT"] | null | null | null |

# Python-StudyingNotes
bobo edited on 2020-06-08
Python basic knowledge && using Python for data analysis
Main contents:
1. Python basic learning notes
2. How to use Python for data analysis?
| 24.875 | 21 | 0.778894 | eng_Latn | 0.956483 |

61e36f7cc164165db3ebe112fe7589c880d74e99 | 9,312 | md | Markdown | README.md | fedotov/js-validator-livr | 68cb6ce6947aa655d833ba28dd57a04a2ce45419 | ["MIT"] | 147 | 2015-04-27T13:46:28.000Z | 2022-01-10T15:05:47.000Z | README.md | fedotov/js-validator-livr | 68cb6ce6947aa655d833ba28dd57a04a2ce45419 | ["MIT"] | 33 | 2015-06-25T14:18:04.000Z | 2022-02-01T10:34:08.000Z | README.md | fedotov/js-validator-livr | 68cb6ce6947aa655d833ba28dd57a04a2ce45419 | ["MIT"] | 27 | 2015-01-12T13:49:10.000Z | 2021-08-23T01:20:58.000Z |

# LIVR Validator
LIVR.Validator - Lightweight JavaScript validator supporting Language Independent Validation Rules Specification (LIVR).
[](https://badge.fury.io/js/livr)
[](https://travis-ci.org/koorchik/js-validator-livr)
[](https://snyk.io/test/github/koorchik/js-validator-livr?targetFile=package.json)
# SYNOPSIS
Common usage:
```javascript
const LIVR = require('livr');
LIVR.Validator.defaultAutoTrim(true);
const validator = new LIVR.Validator({
name: 'required',
email: ['required', 'email'],
gender: { one_of: ['male', 'female'] },
phone: { max_length: 10 },
password: ['required', { min_length: 10 }],
password2: { equal_to_field: 'password' }
});
const validData = validator.validate(userData);
if (validData) {
saveUser(validData);
} else {
console.log('errors', validator.getErrors());
}
```
You can use modifiers separately or combine them with validation:
```javascript
const validator = new LIVR.Validator({
email: ['required', 'trim', 'email', 'to_lc']
});
```
Feel free to register your own rules.

You can use aliases (preferable, as this syntax is covered by the specification) for a lot of cases:
```javascript
const validator = new LIVR.Validator({
password: ['required', 'strong_password']
});
validator.registerAliasedRule({
name: 'strong_password',
rules: { min_length: 6 },
error: 'WEAK_PASSWORD'
});
```
Or you can write more sophisticated rules directly:
```javascript
const validator = new LIVR.Validator({
password: ['required', 'strong_password']
});
validator.registerRules({
strong_password() {
return value => {
// We already have "required" rule to check that the value is present
if (value === undefined || value === null || value === '') return;
if (value.length < 6) {
return 'WEAK_PASSWORD';
}
};
}
});
```
If you use LIVR in the browser, you can import only the rules you use (it can reduce bundle size a little bit):
```javascript
const Validator = require('livr/lib/Validator');
Validator.registerDefaultRules({
required: require('livr/lib/rules/common/required'),
email: require('livr/lib/rules/special/email'),
one_of: require('livr/lib/rules/string/one_of'),
min_length: require('livr/lib/rules/string/min_length'),
max_length: require('livr/lib/rules/string/max_length'),
equal_to_field: require('livr/lib/rules/special/equal_to_field')
});
Validator.defaultAutoTrim(true);
// Anywhere in your app
const Validator = require('livr/lib/Validator');
const validator = new Validator({
name: 'required',
email: ['required', 'email'],
gender: { one_of: ['male', 'female'] },
phone: { max_length: 10 },
password: ['required', { min_length: 10 }],
password2: { equal_to_field: 'password' }
});
const validData = validator.validate(userData);
if (validData) {
saveUser(validData);
} else {
console.log('errors', validator.getErrors());
}
```
# DESCRIPTION
See **[LIVR Specification and rules documentation](http://livr-spec.org)** for detailed documentation and list of supported rules.
**Features:**
- Rules are declarative and language independent
- Any number of rules for each field
- Returns errors for all fields together
- Excludes all fields that do not have validation rules described
- Can validate complex hierarchical structures
- Rules are easy to describe and understand
- Returns understandable error codes (not error messages)
- Easy to add your own rules
- Rules are able to change the result output ("trim", "nested_object", for example)
- Multipurpose (user input validation, configs validation, contracts programming etc)
**JavaScript version extra features:**
- Zero dependencies
- Works in NodeJs and in a browser
- Validator (without rules) is less than 1KB (min+gzip)
- Validator with all rules is 2.84KB (min+gzip)
- **You can find more rules in [livr-extra-rules](https://www.npmjs.com/package/livr-extra-rules)**
# INSTALL
#### nodejs/npm
```bash
npm install livr
```
#### Browser (if you do not use npm)
You can find prebuilt browser versions in the "dist" folder (development/main.js - a non-minified development version with source maps, production/main.js - a minified production version). You may need some polyfills ("isInteger", etc.) for older browsers.
# CLASS METHODS
## new LIVR.Validator(livr, isAutoTrim);
Constructor creates validator objects.
livr - validation rules. The rules description is available here - https://github.com/koorchik/LIVR
isAutoTrim - asks validator to trim all values before validation. Output will be also trimmed.
If isAutoTrim is undefined (or null), then the defaultAutoTrim value will be used.
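For example, to enable trimming for a single instance:

```javascript
// Per-instance auto-trim overrides the global default
const validator = new LIVR.Validator({ email: ['required', 'email'] }, true);
```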
## LIVR.Validator.registerAliasedDefaultRule(alias)
alias - is a plain javascript object that contains: name, rules, error (optional).
```javascript
LIVR.Validator.registerAliasedDefaultRule({
name: 'valid_address',
rules: {
nested_object: {
country: 'required',
city: 'required',
zip: 'positive_integer'
}
}
});
```
Then you can use "valid_address" for validation:
```javascript
{
    address: 'valid_address'
}
```
You can register aliases with own errors:
```javascript
LIVR.Validator.registerAliasedDefaultRule({
    name: 'adult_age',
rules: [ 'positive_integer', { min_number: 18 } ],
error: 'WRONG_AGE'
});
```
All rules/aliases for the validator are equal. The validator does not distinguish "required", "list_of_different_objects" and "trim" rules. So, you can extend validator with any rules/alias you like.
## LIVR.Validator.registerDefaultRules({"rule_name": ruleBuilder })
ruleBuilder - is a function reference which will be called for building single rule validator.
```javascript
LIVR.Validator.registerDefaultRules({
my_rule(arg1, arg2, arg3, ruleBuilders) {
// ruleBuilders - are rules from original validator
// to allow you create new validator with all supported rules
// const validator = new LIVR.Validator(livr).registerRules(ruleBuilders).prepare();
return (value, allValues, outputArr) => {
            // "notValid" stands in for your actual validation check of `value`
            if (notValid) {
                return 'SOME_ERROR_CODE';
            }
            // returning nothing (undefined) means the value is valid
};
}
});
```
Then you can use "my_rule" for validation:
```javascript
{
    name1: 'my_rule', // Call without parameters
    name2: { 'my_rule': arg1 }, // Call with one parameter.
    name3: { 'my_rule': [arg1] }, // Call with one parameter.
    name4: { 'my_rule': [ arg1, arg2, arg3 ] } // Call with many parameters.
}
```
Here is "max_number" implemenation:
```javascript
function maxNumber(maxNumber) {
return value => {
// We do not validate empty fields. We have "required" rule for this purpose
if (value === undefined || value === null || value === '') return;
// return error message
if (value > maxNumber) return 'TOO_HIGH';
};
}
LIVR.Validator.registerDefaultRules({ max_number: maxNumber });
```
All rules for the validator are equal. The validator does not distinguish "required", "list_of_different_objects" and "trim" rules. So, you can extend validator with any rules you like.
## LIVR.Validator.getDefaultRules();
Returns an object containing all default ruleBuilders for the validator. You can register a new rule or update an existing one with the "registerRules" method.
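For example, you could wrap one of the default builders to remap its error code (a sketch, assuming `positive_integer` is among the registered defaults, as the error codes shown later suggest):

```javascript
const defaultRules = LIVR.Validator.getDefaultRules();
const buildPositiveInteger = defaultRules['positive_integer'];

LIVR.Validator.registerDefaultRules({
    positive_integer(...args) {
        const originalRule = buildPositiveInteger(...args);
        return (value, allValues, outputArr) => {
            const error = originalRule(value, allValues, outputArr);
            // Remap the standard error code to a custom one
            return error ? 'MUST_BE_POSITIVE_INTEGER' : undefined;
        };
    }
});
```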
## LIVR.Validator.defaultAutoTrim(isAutoTrim)
Enables or disables automatic trim for input data. If it is on, then every new validator instance will have the auto-trim option enabled.
## LIVR.util
A list of useful utils for writing your rules (see the source code).
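For instance, a custom rule can reuse these helpers instead of hand-rolling empty-value checks. This is a minimal sketch; it assumes `isNoValue` is one of the exported utils, so check the source to confirm:

```javascript
LIVR.Validator.registerDefaultRules({
    max_words(maxWords) {
        return value => {
            // Skip empty values; the "required" rule is responsible for those
            if (LIVR.util.isNoValue(value)) return;
            if (typeof value !== 'string') return 'FORMAT_ERROR';
            if (value.trim().split(/\s+/).length > maxWords) return 'TOO_MANY_WORDS';
        };
    }
});
```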
# OBJECT METHODS
## validator.validate(input)
Validates user input. On success returns validData (containing only the data that has validation rules described). On error returns false.
```javascript
const validData = validator.validate(input);
if (validData) {
// use validData
} else {
const errors = validator.getErrors();
}
```
## validator.getErrors()
Returns errors object.
```javascript
{
"field1": "ERROR_CODE",
"field2": "ERROR_CODE",
...
}
```
For example:
```javascript
{
"country": "NOT_ALLOWED_VALUE",
"zip": "NOT_POSITIVE_INTEGER",
"street": "REQUIRED",
"building": "NOT_POSITIVE_INTEGER"
}
```
## validator.registerRules({"rule_name": ruleBuilder})
ruleBuilder - is a function reference which will be called for building single rule validator.
See "LIVR.Validator.registerDefaultRules" for rules examples.
## validator.registerAliasedRule(alias)
alias - is a composite validation rule.
See "LIVR.Validator.registerAliasedDefaultRule" for rules examples.
## validator.getRules()
returns object containing all ruleBuilders for the validator. You can register new rule or update existing one with "registerRules" method.
# AUTHOR
koorchik (Viktor Turskyi)
# Contributors
eNdiD
# BUGS
Please report any bugs or feature requests on GitHub: https://github.com/koorchik/js-validator-livr
| 28.303951 | 256 | 0.699742 | eng_Latn | 0.75725 |
61e379d898c8e3080f3319527722ee22d94230c6 | 1,407 | md | Markdown | 2020/08/05/2020-08-05 01:40.md | zhzhzhy/WeiBoHot_history | 32ce4800e63f26384abb17d43e308452c537c902 | [
"MIT"
] | 3 | 2020-07-14T14:54:15.000Z | 2020-08-21T06:48:24.000Z | 2020/08/05/2020-08-05 01:40.md | zhzhzhy/WeiBoHot_history | 32ce4800e63f26384abb17d43e308452c537c902 | [
"MIT"
] | null | null | null | 2020/08/05/2020-08-05 01:40.md | zhzhzhy/WeiBoHot_history | 32ce4800e63f26384abb17d43e308452c537c902 | [
"MIT"
] | null | null | null | Data for 01:00, August 5, 2020
Status: 200
1. Sudden explosion in Beirut, the capital of Lebanon
Weibo heat: 1495794
2. Huang Shengyi keeps her hands in her pockets even while crying
Weibo heat: 621674
3. Telling the story of a youth's true self
Weibo heat: 616575
4. Ex-wife speaks on Zhang Yuhuan's intentional homicide conviction being overturned as not guilty
Weibo heat: 616362
5. How attentive can a boyfriend be
Weibo heat: 507672
6. Police issue a notice on a Nanjing female college student missing in Yunnan
Weibo heat: 453193
7. Guan Xiaotong's fierce "grumpy-face" makeup
Weibo heat: 364953
8. CBA
Weibo heat: 361271
9. Swimming with milk balanced on the head without spilling a drop
Weibo heat: 359789
10. YIBO-OFFICIAL
Weibo heat: 358920
11. Agent responds to an 8-year-old child star serving as a judge
Weibo heat: 306307
12. Zhong Xiaoqin's mermaid confession
Weibo heat: 272912
13. Wu Xie reads Liu Sang's Weibo
Weibo heat: 219511
14. Toshihiko Seki diagnosed with COVID-19
Weibo heat: 206584
15. Zhang Yuhuan speaks out for the first time after release from prison
Weibo heat: 205147
16. High school principal handwrites the names of 702 incoming students on their admission letters
Weibo heat: 205066
17. Wang Yibo
Weibo heat: 204205
18. Reasons for liking Lin Youyou
Weibo heat: 203588
19. Zhao Rui
Weibo heat: 203107
20. Never let your family learn WeChat's "tap-tap" nudge feature
Weibo heat: 202709
21. China's ice cream market remains the largest in the world
Weibo heat: 202405
22. Mark Chao visits Lin Gengxin on set
Weibo heat: 198445
23. Face mask lanyards
Weibo heat: 193902
24. Should you tell the people around you after winning the lottery
Weibo heat: 179440
25. Reboot (重启)
Weibo heat: 171554
26. Epidemic-prevention workers hug ice blocks to cool down
Weibo heat: 168998
27. "Twenty Your Life On" trailer
Weibo heat: 149960
28. CrossFire
Weibo heat: 144530
29. Foreign Ministry responds to US visa restrictions on Chinese journalists
Weibo heat: 142923
30. LOFTER
Weibo heat: 141804
31. Never let your cat play with dogs too often
Weibo heat: 137063
32. Communication University of China extends its doctoral program to 4 years
Weibo heat: 126692
33. The liveliest scenes in variety shows
Weibo heat: 118725
34. Guangdong-Hong Kong-Macao Greater Bay Area intercity railway construction plan approved
Weibo heat: 117342
35. COVID-19 treatment drugs proposed for inclusion in the medical insurance catalog
Weibo heat: 116304
36. Fall Guys
Weibo heat: 113088
37. Xiang Zuo to travel to Taiwan to accompany Guo Biting before the birth
Weibo heat: 104499
38. Harbin warehouse collapse death toll rises to 4
Weibo heat: 98330
39. Zhang Yixing's hat-trick tutorial
Weibo heat: 95057
40. Zeng Keni cheers for Jeremy Lin
Weibo heat: 90041
41. Zhang Yuhuan case may seek about 7 million yuan in state compensation
Weibo heat: 87968
42. Rainbow molten-filled giant toast
Weibo heat: 84684
43. Beautiful moments of summer
Weibo heat: 83299
44. Whale mother and calf playfully approach tourists
Weibo heat: 81882
45. Xinjiang's current outbreak traced to a single source of infection
Weibo heat: 74403
46. Hu Mingxuan's three-pointer
Weibo heat: 72440
47. Go hug a friend you haven't seen in a long time
Weibo heat: 72288
48. College students develop automatic makeup for ID photos
Weibo heat: 64629
49. Zhang Meng: I wouldn't play a role like Lin Youyou
Weibo heat: 63405
50. Tokyo Olympics may be held behind closed doors
Weibo heat: 63189
| 6.897059 | 21 | 0.781805 | yue_Hant | 0.396722 |
61e3b539c1785a14b5ca17c2666e84c60687a17b | 50 | md | Markdown | README.md | johnny-stene/libre-coffee | 3384eb5a643c7b41bdf848a87ed01bb1fe890fb6 | [
"MIT"
] | null | null | null | README.md | johnny-stene/libre-coffee | 3384eb5a643c7b41bdf848a87ed01bb1fe890fb6 | [
"MIT"
] | null | null | null | README.md | johnny-stene/libre-coffee | 3384eb5a643c7b41bdf848a87ed01bb1fe890fb6 | [
"MIT"
] | null | null | null | # libre-coffee
Free and Open Source Coffee Maker.
| 16.666667 | 34 | 0.78 | eng_Latn | 0.733602 |
61e3c978c745faf3a4abf376aa4d3dc252bb212f | 739 | md | Markdown | packages/docs/developer-guide/sdk-code-reference/summary-1/modules/_types_.celoparams.md | medhak1/celo-monorepo | 42b9edf1dc433cc3828d179387a8e239e16b7290 | [
"Apache-2.0"
] | null | null | null | packages/docs/developer-guide/sdk-code-reference/summary-1/modules/_types_.celoparams.md | medhak1/celo-monorepo | 42b9edf1dc433cc3828d179387a8e239e16b7290 | [
"Apache-2.0"
] | null | null | null | packages/docs/developer-guide/sdk-code-reference/summary-1/modules/_types_.celoparams.md | medhak1/celo-monorepo | 42b9edf1dc433cc3828d179387a8e239e16b7290 | [
"Apache-2.0"
] | null | null | null | # CeloParams
## Hierarchy
* **CeloParams**
## Index
### Properties
* [feeCurrency]()
* [gatewayFee]()
* [gatewayFeeRecipient]()
## Properties
### feeCurrency
• **feeCurrency**: _string_
_Defined in_ [_packages/sdk/connect/src/types.ts:7_](https://github.com/celo-org/celo-monorepo/blob/master/packages/sdk/connect/src/types.ts#L7)
### gatewayFee
• **gatewayFee**: _string_
_Defined in_ [_packages/sdk/connect/src/types.ts:9_](https://github.com/celo-org/celo-monorepo/blob/master/packages/sdk/connect/src/types.ts#L9)
### gatewayFeeRecipient
• **gatewayFeeRecipient**: _string_
_Defined in_ [_packages/sdk/connect/src/types.ts:8_](https://github.com/celo-org/celo-monorepo/blob/master/packages/sdk/connect/src/types.ts#L8)
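A minimal sketch of constructing an object with these three properties; the import path and the field values are illustrative assumptions, not part of the generated docs:
```typescript
import { CeloParams } from '@celo/connect'; // assumed import path

const params: Pick<CeloParams, 'feeCurrency' | 'gatewayFee' | 'gatewayFeeRecipient'> = {
  feeCurrency: '0x765DE816845861e75A25fCA122bb6898B8B1282a', // e.g., a fee token address
  gatewayFee: '0x0',
  gatewayFeeRecipient: '0x0000000000000000000000000000000000000000',
};
```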
| 21.114286 | 144 | 0.734777 | eng_Latn | 0.204968 |
61e469fc3fc9df0dd19229a818796ae9f4596c4f | 3,085 | md | Markdown | README.md | Sudolphus/sudo-games-site | dcfaf3193bcfb948c8c33e10a86c854f80c0ce95 | [
"MIT"
] | null | null | null | README.md | Sudolphus/sudo-games-site | dcfaf3193bcfb948c8c33e10a86c854f80c0ce95 | [
"MIT"
] | null | null | null | README.md | Sudolphus/sudo-games-site | dcfaf3193bcfb948c8c33e10a86c854f80c0ce95 | [
"MIT"
] | null | null | null | <p align="center">
<a href="" rel="noopener">
<img width=640px height=426px src="./card-logo.jpg" alt="Project logo"></a>
</p>
<h3 align="center">Sudo Game Site</h3>
<div align="center">
[](https://github.com/Sudolphus/sudo-games-site/issues)
[](https://github.com/Sudolphus/sudo-games-site/pulls)
[](/LICENSE)
</div>
---
<p align="center"> A site for playing some games.
<br>
</p>
## 📝 Table of Contents
- [About](#about)
- [Getting Started](#getting_started)
- [Deployment](#deployment)
- [Usage](#usage)
- [Built Using](#built_using)
- [Authors](#authors)
- [Acknowledgments](#acknowledgement)
## 🧐 About <a name = "about"></a>
I love playing games, so I decided to build a site where I can play them.
## 🏁 Getting Started <a name = "getting_started"></a>
These instructions will get you a copy of the project up and running on your local machine for development and testing purposes. See [deployment](#deployment) for notes on how to deploy the project on a live system.
### Prerequisites
1. A code editor
2. [Node.js](https://nodejs.org)
3. A browser
4. A terminal application, such as [Git Bash](https://git-scm.com/downloads)
5. A package manager, such as [npm](https://www.npmjs.com/) or [yarn](https://yarnpkg.com/)
### Installing
1. Download the repo to the directory of your choice, by either clicking the download button, or run `git clone https://github.com/Sudolphus/sudo-games-site.git`
2. Open with the code editor of your choice, and edit to your heart's content.
3. To run the code, navigate into the root directory with your terminal application and acquire the packages with `npm install` or `yarn install`, depending on your package manager.
4. A development build can be started with `npm run dev` or `yarn dev`, which can then be viewed in your browser at `localhost:3000`
5. A production build can be created with `npm run build` or `yarn build`, which will create a public/ folder with optimized code bundled by webpack.
## 🎈 Usage <a name="usage"></a>
Once the system is installed and online, it should be as easy as opening the page in your browser.
## ⛏️ Built Using <a name = "built_using"></a>
- [NodeJs](https://nodejs.org/en/) - Server Environment
- [ReactJs](https://reactjs.org/) - Frontend Interface
- [NextJs](https://nextjs.org/) - React Platform
## ✍️ Author <a name = "authors"></a>
- [Sudolphus](https://github.com/Sudolphus) - Idea & Initial work
## 🎉 Acknowledgements <a name = "acknowledgement"></a>
- <span>Project Logo by <a href="https://unsplash.com/@amandagraphc?utm_source=unsplash&utm_medium=referral&utm_content=creditCopyText">Amanda Jones</a> on <a href="https://unsplash.com/s/photos/card-game?utm_source=unsplash&utm_medium=referral&utm_content=creditCopyText">Unsplash</a></span>
| 41.689189 | 308 | 0.722528 | eng_Latn | 0.861073 |
61e4e796b8fc11f40e0a343dd3f888a96236ffcd | 23 | md | Markdown | README.md | IOTA-Ledger/ledger-app-iota-tests | 72dfd97cdd4d2b7e7271f7b6b19ea858ac63feda | [
"MIT"
] | 1 | 2021-04-15T20:16:24.000Z | 2021-04-15T20:16:24.000Z | README.md | IOTA-Ledger/ledger-app-iota-tests | 72dfd97cdd4d2b7e7271f7b6b19ea858ac63feda | [
"MIT"
] | null | null | null | README.md | IOTA-Ledger/ledger-app-iota-tests | 72dfd97cdd4d2b7e7271f7b6b19ea858ac63feda | [
"MIT"
] | 1 | 2019-11-29T09:39:38.000Z | 2019-11-29T09:39:38.000Z | # ledger-app-iota-tests | 23 | 23 | 0.782609 | swe_Latn | 0.35268 |
61e509fb6fe1dd2389aac844d251fb172eaebe66 | 88 | md | Markdown | markdown/org/docs/patterns/carlton/options/nl.md | TriploidTree/freesewing | 428507c6682d49a0869a13ce56b4a38d844a8e5c | [
"MIT"
] | null | null | null | markdown/org/docs/patterns/carlton/options/nl.md | TriploidTree/freesewing | 428507c6682d49a0869a13ce56b4a38d844a8e5c | [
"MIT"
] | 2 | 2022-02-04T13:28:21.000Z | 2022-02-04T14:07:38.000Z | markdown/org/docs/patterns/carlton/options/nl.md | SeaZeeZee/freesewing | 8589e7b7ceeabd738c4b69ac59980acbebfea537 | [
"MIT"
] | null | null | null | - - -
title: "Carlton coat: Design Options"
- - -
<PatternOptions pattern='carlton' />
| 14.666667 | 37 | 0.647727 | kor_Hang | 0.412549 |
61e54a11df61f4c6cd25706c1943faca4f1dc6b9 | 129 | md | Markdown | ahhhhhhhhhh/Library/PackageCache/[email protected]/Documentation~/edit-mode-vs-play-mode-tests.md | kenx00x/ahhhhhhhhhhhh | e3610c77f11bb0c142ee422ce8b9aad4f99d12b5 | [
"MIT"
] | null | null | null | ahhhhhhhhhh/Library/PackageCache/[email protected]/Documentation~/edit-mode-vs-play-mode-tests.md | kenx00x/ahhhhhhhhhhhh | e3610c77f11bb0c142ee422ce8b9aad4f99d12b5 | [
"MIT"
] | null | null | null | ahhhhhhhhhh/Library/PackageCache/[email protected]/Documentation~/edit-mode-vs-play-mode-tests.md | kenx00x/ahhhhhhhhhhhh | e3610c77f11bb0c142ee422ce8b9aad4f99d12b5 | [
"MIT"
] | null | null | null | version https://git-lfs.github.com/spec/v1
oid sha256:8fd06976aa504ee550313b1f77532f1afdb94e431915753f4cfb365707a94488
size 2724
| 32.25 | 75 | 0.883721 | kor_Hang | 0.148447 |
61e5a64a68aa6541494da1ed80d575e6a8c67075 | 4,527 | md | Markdown | model_zoo/official/cv/nasnet/README_CN.md | LottieWang/mindspore | 1331c7e432fb691d1cfa625ab7cc7451dcfc7ce0 | [
"Apache-2.0"
] | null | null | null | model_zoo/official/cv/nasnet/README_CN.md | LottieWang/mindspore | 1331c7e432fb691d1cfa625ab7cc7451dcfc7ce0 | [
"Apache-2.0"
] | null | null | null | model_zoo/official/cv/nasnet/README_CN.md | LottieWang/mindspore | 1331c7e432fb691d1cfa625ab7cc7451dcfc7ce0 | [
"Apache-2.0"
] | null | null | null | # 目录
<!-- TOC -->
- [目录](#目录)
- [NASNet概述](#NASNet概述)
- [模型架构](#模型架构)
- [数据集](#数据集)
- [环境要求](#环境要求)
- [脚本说明](#脚本说明)
- [脚本和样例代码](#脚本和样例代码)
- [脚本参数](#脚本参数)
- [训练过程](#训练过程)
- [评估过程](#评估过程)
- [模型描述](#模型描述)
- [性能](#性能)
- [训练性能](#训练性能)
- [评估性能](#评估性能)
- [ModelZoo主页](#modelzoo主页)
<!-- /TOC -->
# NASNet概述
[论文](https://arxiv.org/abs/1707.07012): Barret Zoph, Vijay Vasudevan, Jonathon Shlens, Quoc V. Le. Learning Transferable Architectures for Scalable Image Recognition. 2017.
# 模型架构
NASNet总体网络架构如下:
[链接](https://arxiv.org/abs/1707.07012)
# 数据集
使用的数据集:[imagenet](http://www.image-net.org/)
- 数据集大小:125G,共1000个类、1.2万张彩色图像
- 训练集:120G,共1.2万张图像
- 测试集:5G,共5万张图像
- 数据格式:RGB
* 注:数据在src/dataset.py中处理。
# 环境要求
- 硬件:GPU
- 使用GPU处理器来搭建硬件环境。
- 框架
- [MindSpore](https://www.mindspore.cn/install)
- 如需查看详情,请参见如下资源:
- [MindSpore教程](https://www.mindspore.cn/tutorials/zh-CN/master/index.html)
- [MindSpore Python API](https://www.mindspore.cn/docs/api/zh-CN/master/index.html)
# 脚本说明
## 脚本及样例代码
```python
.
└─nasnet
├─README.md
├─README_CN.md
├─scripts
├─run_standalone_train_for_gpu.sh # 使用GPU平台启动单机训练(单卡)
├─run_distribute_train_for_gpu.sh # 使用GPU平台启动分布式训练(8卡)
└─run_eval_for_gpu.sh # 使用GPU平台进行启动评估
├─src
├─config.py # 参数配置
├─dataset.py # 数据预处理
├─loss.py # 自定义交叉熵损失函数
├─lr_generator.py # 学习率生成器
├─nasnet_a_mobile.py # 网络定义
├─eval.py # 评估网络
├─export.py # 转换检查点
└─train.py # 训练网络
```
## 脚本参数
在config.py中可以同时配置训练参数和评估参数。
```python
'random_seed':1, # 固定随机种子
'rank':0, # 分布式训练进程序号
'group_size':1, # 分布式训练分组大小
'work_nums':8, # 数据读取人员数
'epoch_size':500, # 总周期数
'keep_checkpoint_max':100, # 保存检查点最大数
'ckpt_path':'./checkpoint/', # 检查点保存路径
'is_save_on_master':1 # 在rank0上保存检查点,分布式参数
'batch_size':32, # 输入批次大小
'num_classes':1000, # 数据集类数
'label_smooth_factor':0.1, # 标签平滑因子
'aux_factor':0.4, # 副对数损失系数
'lr_init':0.04, # 启动学习率
'lr_decay_rate':0.97, # 学习率衰减率
'num_epoch_per_decay':2.4, # 衰减周期数
'weight_decay':0.00004, # 重量衰减
'momentum':0.9, # 动量
'opt_eps':1.0, # epsilon参数
'rmsprop_decay':0.9, # rmsprop衰减
'loss_scale':1, # 损失规模
```
## 训练过程
### 用法
```bash
# 分布式训练示例(8卡)
bash run_distribute_train_for_gpu.sh DATA_DIR
# 单机训练
bash run_standalone_train_for_gpu.sh DEVICE_ID DATA_DIR
```
### 运行
```bash
# GPU分布式训练示例(8卡)
bash scripts/run_distribute_train_for_gpu.sh /dataset/train
# GPU单机训练示例
bash scripts/run_standalone_train_for_gpu.sh 0 /dataset/train
```
### 结果
可以在日志中找到检查点文件及结果。
## 评估过程
### 用法
```bash
# 评估
bash run_eval_for_gpu.sh DEVICE_ID DATA_DIR PATH_CHECKPOINT
```
### 启动
```bash
# 检查点评估
bash scripts/run_eval_for_gpu.sh 0 /dataset/val ./checkpoint/nasnet-a-mobile-rank0-248_10009.ckpt
```
> 训练过程中可以生成检查点。
### 结果
评估结果保存在脚本路径下。路径下的日志中,可以找到如下结果:
acc=73.5%(TOP1)
# 模型描述
## 性能
### 训练性能
| 参数 | NASNet |
| -------------------------- | ------------------------- |
| 资源 | NV SMX2 V100-32G |
| 上传日期 | 2020-09-24 |
| MindSpore版本 | 1.0.0 |
| 数据集 | ImageNet |
| 训练参数 | src/config.py |
| 优化器 | Momentum |
| 损失函数 | SoftmaxCrossEntropyWithLogits |
| 损失值 | 1.8965 |
| 总时间 | 8卡运行约144个小时 |
| 检查点文件大小 | 89 M(.ckpt文件) |
### 评估性能
| 参数 | |
| -------------------------- | ------------------------- |
| 资源 | NV SMX2 V100-32G |
| 上传日期 | 2020-09-24 |
| MindSpore版本 | 1.0.0 |
| 数据及 | ImageNet, 1.2W |
| batch_size | 32 |
| 输出 | 概率 |
| 精确度 | acc=73.5%(TOP1) |
# ModelZoo主页
请浏览官网[主页](https://gitee.com/mindspore/mindspore/tree/master/model_zoo)。
| 24.208556 | 172 | 0.481555 | yue_Hant | 0.455811 |
61e5c30d3e571d631dbb7fc61803d8b1f02461dd | 6,516 | md | Markdown | _posts/2021-10-16-narappa-2021-full-online-pdsik-movies.md | pdiskmovies/pdiskmovies.github.io | 99083af56af8d8bfc9301b265df8e4fb97fff2c2 | [
"MIT"
] | null | null | null | _posts/2021-10-16-narappa-2021-full-online-pdsik-movies.md | pdiskmovies/pdiskmovies.github.io | 99083af56af8d8bfc9301b265df8e4fb97fff2c2 | [
"MIT"
] | null | null | null | _posts/2021-10-16-narappa-2021-full-online-pdsik-movies.md | pdiskmovies/pdiskmovies.github.io | 99083af56af8d8bfc9301b265df8e4fb97fff2c2 | [
"MIT"
] | null | null | null | ---
id: 3364
title: narappa (2021) full online Pdsik movies
date: 2021-10-16T05:58:10+00:00
author: tentrockers
layout: post
guid: https://tentrockers.online/narappa-2021-full-online-pdsik-movies/
permalink: /narappa-2021-full-online-pdsik-movies/
cyberseo_rss_source:
- 'https://www.pdiskmovies.net/feeds/posts/default?max-results=100&start-index=1101'
cyberseo_post_link:
- https://www.pdiskmovies.net/2021/07/narappa-2021-full-online-pdsik-movies.html
categories:
- Uncategorized
---
<a href="https://www.pdiskmovies.net/p/coming-soon.html" target="popup" onclick="window.open('https://www.pdiskmovies.net/p/coming-soon.html'); return false;" rel="noopener"><br /> Play Movie<br /> </a>
<div class="separator">
<a href="https://1.bp.blogspot.com/-USqJIPQzgS0/YPbRsBDT28I/AAAAAAAAZjg/_HfrrU7H6rMZky262dV-p79u_A5MOVM2ACLcBGAsYHQ/s696/narappa%2B%25282021%2529%2Bfull%2Bonline%2BPdsik%2Bmovies.jpg"><img loading="lazy" border="0" data-original-height="522" data-original-width="696" height="480" src="https://1.bp.blogspot.com/-USqJIPQzgS0/YPbRsBDT28I/AAAAAAAAZjg/_HfrrU7H6rMZky262dV-p79u_A5MOVM2ACLcBGAsYHQ/w640-h480/narappa%2B%25282021%2529%2Bfull%2Bonline%2BPdsik%2Bmovies.jpg" width="640" /></a>
</div>
<div>
<div>
<span>Narappa bears the heaviness of Srikanth Addala retelling an all around fruitful story pulled off by Vetrimaaran with Asuran. It likewise delivers only a couple days after the 36th commemoration of the Karamchedu slaughter which saw six dalits killed, three dalit ladies assaulted and many dislodged by the overlords in the town on July 17. Very much like in the slaughter that stays new on individuals’ brains, a chain of occasions lead to gore, tears and torment in the film. </span>
</div>
<div>
<span>Narappa (Venkatesh Daggubati) is a maturing alcoholic rancher who likes to choose not to retaliate as opposed to retaliate and stand up the persecution him and his friends face in the town. Unbeknownst to his hot-headed children Munikanna (Karthik Ratnam) and Sinappa (Rakhi) he has a justification being how he is. What they likewise don’t know is that he has an agonizing, brutal past and will go to any lengths to guard them. A rich property manager Pandusami (Aadukalam Naren) needs to snatch their three-sections of land of land to set up a concrete manufacturing plant regardless of previously possessing the greater part of the land in the town. At the point when Narappa’s oldest child Munikanna, egged on by his uncle Basavayya (Rajeev Kanakala) and mother Sundaramma (Priyamani), won’t endure the rank divergence in the town, he sets off a chain of occasions that see his dad battle to save the family. </span>
</div>
<div>
<span>In this variation of Poomani’s acclaimed novel Vekkai, Srikanth Addala escapes the safe place of his normally brilliant and cheerful family dramatizations. He remains consistent with the first film Asuran, in any event, venturing to redo it scene-to-scene, outline to-casing and discourse to-exchange. While the word standing is unuttered generally of the film, it gauges weighty on the storyline as well as the manner in which characters are dealt with. It’s unmistakable when a bunch of individuals aren’t permitted admittance to land, water and even footwear that more than ‘class dissimilarity’ (as many like to whitewash it) is at play here. Narappa has lost a ton in daily routine and experiences in steady dread of losing the piece of paradise he has made for himself. He’d prefer figure things out genially than face people pulling the strings and hazard losing everything. </span>
</div>
<div>
<span>Venkatesh shoulders the film generally, feeling open to having his impact as the maturing alcoholic patriarch who is continually disparaged by people around him, including his own family. He empties life into it, particularly in the passionate and battle scenes that request a ton of him. It is anyway odd to see him with the extremely youthful Abhirami. Priyamani, Karthik Ratnam, Rakhi, Rajeev Kanakala and rest of the cast likewise give it their all regardless of whether they waver now and again. Karthik Ratnam and Raajev Kanakala specifically stick out while Nasser and Rao Ramesh feel miscast. The battle scenes arranged by Peter Hein and the recognizable BGM from Asuran made by GV Prakash Kumar loan to Narappa’s frantic bid for the endurance of his family. Mani Sharma’s music also does its touch. </span>
</div>
<div>
<span>Narappa is a natural area for the individuals who have seen Asuran. It probably won’t be pretty much as smooth as the first because of a portion of the exhibitions by the supporting cast, yet it figures out how to convey the idea noisy and clear. The projecting of Venkatesh Daggubati specifically for the lead job may likewise not agree with everybody, appearing to be musically challenged. For the individuals who haven’t seen Vetrimaaran’s work, this is another film like the new Uppena, Color Photo, Palasa 1978, Dorasaani, et al that arrangement with the subject well. Allow it an opportunity on the off chance that you wouldn’t fret watching a film that drives you to take a gander at your own biases.</span>
</div>
</div>
<div>
<span></p>
<div>
<ul>
<li>
narappa amazon prime release date
</li>
<li>
narappa release date on ott
</li>
<li>
narappa release date postponed
</li>
<li>
narappa telugu full movie release date
</li>
<li>
narappa imdb rating
</li>
<li>
narappa movierulz
</li>
<li>
narappa streaming
</li>
<li>
narappa release date
</li>
<li>
narappa movierulz download
</li>
<li>
narappa movie download isaimini
</li>
<li>
narappa telugu movie download movierulz
</li>
<li>
narappa full movie watch online free
</li>
<li>
narappa ott release date
</li>
<li>
narappa full movie movierulz
</li>
<li>
tamilrockers
</li>
</ul>
</div>
<p>
</span></div>
<p>
<a href="https://www.pdiskmovies.net/p/coming-soon.html" target="popup" onclick="window.open('https://www.pdiskmovies.net/p/coming-soon.html'); return false;" rel="noopener"><br /> Play Movie<br /> </a>
</p> | 62.653846 | 953 | 0.729282 | eng_Latn | 0.987883 |
61e623cf8848ff2f7e12c5f80ffc43a52408fd8d | 1,481 | md | Markdown | README.md | OldBigBuddha/fuga.dev | fb4feee71d3293363162092b9a815fa4d5c878da | [
"Apache-2.0"
] | null | null | null | README.md | OldBigBuddha/fuga.dev | fb4feee71d3293363162092b9a815fa4d5c878da | [
"Apache-2.0"
] | 3 | 2019-03-09T16:09:14.000Z | 2019-03-10T10:22:16.000Z | README.md | OldBigBuddha/fuga.dev | fb4feee71d3293363162092b9a815fa4d5c878da | [
"Apache-2.0"
] | 1 | 2019-03-09T15:56:21.000Z | 2019-03-09T15:56:21.000Z | # fuga.dev
[](https://app.netlify.com/sites/fugadev/deploys)
[](https://github.com/OldBigBuddha/fuga.dev)
[](https://github.com/standard/standard)
.dev一般開放に伴ったネタサイト、フガーな命名をされてしまった変数・関数・クラスたちを弔おう。
~~fuga.dev取れませんでした~~ **[@Satopppy_](https://twitter.com/Satopppy_)のお陰で取得できました!**
# Contributing
`develop` ブランチから各自feature単位でブランチを切り、`develop` ブランチへPRを投げてください。
## commit message
このリポジトリ内では以下の書式に従ってください。
また、メッセージは全て英語で記述するものとします。
```
[verb] About commit
<Empty line>
Detail
```
2行目以降は任意です。
`[]` 内の verb には以下の表のうちのどれかでなくてはいけません。
| Verb | 意味 |
| ---- | --- |
| add | 新規のファイル、機能及びモジュール等の追加 |
| update | 既存のファイル、機能及びモジュール等の変更 |
| fix | (報告、未報告とわず)バグを修正 |
| remove | 既存のファイル、機能及びモジュール等の削除 |
| clean | リファクタリング |
コミットメッセージに詳細を含む場合は、 **必ず** 2行目を空行にしてください。
READM.mdのContributorの欄に自分の名前を載せた場合、このようになることを想定しています。
```
[add] Contributor of README.md
Add my name
```
# Contributor
- [OldBigBuddha](https://github.com/OldBigBuddha)
- [KNTK](https://github.com/AkihiroTokai)
追加雛形
```
- [名前](GitHubのURL)
```
# Special Thanks
ドメイン提供者: [@Satopppy_](https://twitter.com/Satopppy_)
# LICENSE
See [./LICENSE](./LICENSE)
This application includes the work that is distributed in the Apache License 2.0
(C) 2019 OldBigBuddha.
| 22.104478 | 156 | 0.736664 | yue_Hant | 0.568879 |
61e66ea5cc4108040fb74f6819f292d85770e94a | 1,034 | md | Markdown | QA/RemoveElement.md | CW0149/leetcode-easy | 5ffbbcadd5526a4b1c54f166f206e78e1a1843e6 | [
"RSA-MD"
] | null | null | null | QA/RemoveElement.md | CW0149/leetcode-easy | 5ffbbcadd5526a4b1c54f166f206e78e1a1843e6 | [
"RSA-MD"
] | null | null | null | QA/RemoveElement.md | CW0149/leetcode-easy | 5ffbbcadd5526a4b1c54f166f206e78e1a1843e6 | [
"RSA-MD"
] | null | null | null | # [移除元素](https://leetcode-cn.com/problems/remove-element)
### 问题
给定一个数组 nums 和一个值 val,你需要原地移除所有数值等于 val 的元素,返回移除后数组的新长度。
不要使用额外的数组空间,你必须在原地修改输入数组并在使用 O(1) 额外空间的条件下完成。
元素的顺序可以改变。你不需要考虑数组中超出新长度后面的元素。
示例 1:
```
给定 nums = [3,2,2,3], val = 3,
函数应该返回新的长度 2, 并且 nums 中的前两个元素均为 2。
你不需要考虑数组中超出新长度后面的元素。
```
示例 2:
```
给定 nums = [0,1,2,2,3,0,4,2], val = 2,
函数应该返回新的长度 5, 并且 nums 中的前五个元素为 0, 1, 3, 0, 4。
注意这五个元素可为任意顺序。
你不需要考虑数组中超出新长度后面的元素。
```
说明:
为什么返回数值是整数,但输出的答案是数组呢?
请注意,输入数组是以“引用”方式传递的,这意味着在函数里修改输入数组对于调用者是可见的。
你可以想象内部操作如下:
// nums 是以“引用”方式传递的。也就是说,不对实参作任何拷贝
int len = removeElement(nums, val);
// 在函数里修改输入数组对于调用者是可见的。
// 根据你的函数返回的长度, 它会打印出数组中该长度范围内的所有元素。
for (int i = 0; i < len; i++) {
print(nums[i]);
}
### 解答
```
/**
* @param {number[]} nums
* @param {number} val
* @return {number}
*/
var removeElement = function(nums, val) {
var i = 0;
for (var j = 0; j < nums.length; j += 1) {
if (nums[j] !== val) {
nums[i] = nums[j];
i += 1;
}
}
return i;
};
```
| 15.205882 | 57 | 0.609284 | yue_Hant | 0.218955 |
61e722c9f6d17b58c18cbe87f0d6b25ab6c8807d | 19,540 | md | Markdown | _posts/2018-01-26-book-building-evolutionary-architectures.md | shoukai/shoukai.github.io | 061b294a50ab6d1ece0080561d4a3b17792a3273 | [
"Apache-2.0"
] | 2 | 2021-02-04T09:07:54.000Z | 2021-12-01T06:26:38.000Z | _posts/2018-01-26-book-building-evolutionary-architectures.md | shoukai/shoukai.github.io | 061b294a50ab6d1ece0080561d4a3b17792a3273 | [
"Apache-2.0"
] | null | null | null | _posts/2018-01-26-book-building-evolutionary-architectures.md | shoukai/shoukai.github.io | 061b294a50ab6d1ece0080561d4a3b17792a3273 | [
"Apache-2.0"
] | 1 | 2021-12-01T06:26:40.000Z | 2021-12-01T06:26:40.000Z | ---
layout: post
title: "读书笔记:演进式架构"
subtitle: "Building Evolutionary Architectures"
date: 2018-01-30 18:00:00
author: "Shoukai Huang"
header-img: 'cdn.apframework.com/f4d6469ceef0c79e8615cd6e722a7770.jpg'
header-mask: 0.4
tags: 读书笔记 架构
---
# Building Evolutionary Architectures
-------
## Chapter 1. Software Architecture
-------
* An initial part of an architect’s job is to understand the business or domain requirements for a proposed solution.
* Here is our **definition of evolutionary architecture**: An evolutionary architecture supports guided, incremental change across multiple dimensions. (进化软件架构定义:演进式体系结构支持跨多个维度的引导性增量变更。)
* There are no separate systems. The world is a continuum. Where to draw a boundary around a system depends on the purpose of the discussion. - Donella H. Meadows
* **Multiple Architectural Dimensions** Here are some common dimensions that affect evolvability in modern software architectures:
* Technical: The implementation parts of the architecture: the frameworks, dependent libraries, and the implementation language( s).
* Data: Database schemas, table layouts, optimization planning, etc. The database administrator generally handles this type of architecture.
* Security: Defines security policies, guidelines, and specifies tools to help uncover deficiencies.
* Operational/System: Concerns how the architecture maps to existing physical and/ or virtual infrastructure: servers, machine clusters, switches, cloud resources, and so on.
* Each of these perspectives forms a dimension of the architecture — an intentional partitioning of the parts supporting a particular perspective.
* An architecture consists of both requirements and other dimensions, each protected by fitness functions
{:height="100%" width="100%"}
* An evolutionary architecture consists of three primary aspects: **incremental change, fitness functions, and appropriate coupling**.
## Chapter 2. Fitness Functions
-------
* The word guided(进化软件架构定义中的指导性) indicates that some objective exists that architecture should move toward or exhibit.
* We use this concept to **define architectural fitness functions**:
* An architectural fitness function provides an objective integrity assessment of some architectural characteristic(s).
* 恰当功能提供了一些架构特征的客观完整性评估
* The systemwide fitness function is crucial for an architecture to be evolutionary, as we need some basis to allow architects to compare and evaluate architectural characteristics against one another.
{:height="100%" width="100%"}
* A system is never the sum of its parts. It is the product of the interactions of its parts. Dr. - Russel Ackoff
* **Identify Fitness Functions Early.** Teams should identify fitness functions as part of their initial understanding of the overall architecture concerns that their design must support. They should also identify their system fitness function early to help determine the sort of change that they want to support.
* Fitness functions can be classified into three simple categories:
* Key(关键): These dimensions are critical in making technology or design choices. More effort should be invested to explore design choices that make change around these elements significantly easier. For example, for a banking application, performance and resiliency are key dimensions.
* Relevant(相关): These dimensions need to be considered at a feature level, but are unlikely to guide architecture choices. For example, code metrics around the quality of code base are important but not key.
* Not Relevant(不相关): Design and technology choices are not impacted by these types of dimensions. For example, process metrics such as cycle time (the amount of time to move from design to implementation, may be important in some ways but is irrelevant to architecture). As a result, fitness functions for it are not necessary.
* Review Fitness Functions A fitness function review is a meeting with key business and technical stakeholders with the goal of updating fitness functions to meet design goals.
## Chapter 3. Engineering Incremental Change
-------
* Our definition of evolutionary architecture implies incremental change, meaning the architecture should facilitate change in small increments. (意味着架构应该通过小的增量来促进变化)
* Combining Fitness Function Categories(结合恰当功能进行分类). Fitness function categories often intersect when implementing them in mechanisms like deployment pipelines.
1. atomic + triggered: This type of fitness function is exemplified by unit and functional tests run as part of software development.
2. holistic + triggered: Holistic, triggered fitness functions are designed to run as part of integration testing via a deployment pipeline.
3. atomic + continual: Continual tests run as part of the architecture, and developers design around their presence.
4. holistic + continual: Holistic, continual fitness functions test multiple parts of the system all the time.
**PenultimateWidgets deployment pipeline**
{:height="100%" width="100%"}
PenultimateWidgets’ deployment pipeline consists
of six stages.
* Stage 1 — Replicating CI The first stage replicates the behavior of the former CI server, running unit, and functional tests.
* Stage 2 — Containerize and Deploy Developers use the second stage to build containers for their service, allowing deeper levels of testing, including deploying the containers to a dynamically created test environment.
* Stage 3 — Atomic Fitness Functions In the third stage atomic fitness functions, including automated scalability tests and security penetration testing, are executed. This
* Stage 4 — Holistic Fitness Functions The fourth stage focuses on holistic fitness functions, including testing contracts to protect integration points and some further scalability tests.
* Stage 5a — Security Review (manual) This stage includes a manual stage by a specific security group within the organization to review, audit, and assess any security vulnerabilities in the codebase.
* Stage 5b — Auditing (manual) PenultimateWidgets is based in Springfield, where the state mandates specific auditing rules.
* Stage 6 — Deployment The last stage is deployment into the production environment.
## Chapter 4. Architectural Coupling
-------
* Discussions about architecture frequently boil down to coupling: how the pieces of the architecture connect and rely on one another(架构的各个部分是如何相互连接和依赖的).
* In evolutionary architecture, architects deal with architectural quanta, the parts of a system held together by hard-to-break forces.
* The relationship between modules, components and quanta
{:height="100%" width="100%"}
* the outermost container is the quantum: the deployable unit that includes all the facilities required for the system to function properly, including data.
* Within the quantum, several components exist, each consisting of code (classes, packages, namespaces, functions, and so on). An external component (from an open source project) also exists as a library, a component packaged for reuse within a given platform. Of course, developers can mix and match all possible combinations of these common building blocks.
**架构风格演变**
分别从Incremental change(增量更改)、Guided change via fitness functions(恰当功能指导变更)和Appropriate coupling(适当耦合)三个方面,分析历史过程中的各个架构风格
#### Big Ball of Mud
1. 增量更改: Making any change in this architecture is difficult.
2. 指导变更: Building fitness functions for this architecture is difficult because no clearly defined partitioning exists.
3. 适当耦合: This architectural style is a good example of inappropriate coupling.
#### Monoliths
1. 增量更改: Large quantum size hinders incremental change because high coupling requires deploying large chunks of the application.
2. 指导变更: Building fitness functions for monoliths is difficult but not impossible.
3. 适当耦合: A monolithic architecture, with little internal structure outside simple classes, exhibits coupling almost as bad as a Big Ball of Mud.
#### Layered architecture
Each layer represents a technical capability, allowing developers to swap out technical architecture functionality easily.
{:height="100%" width="100%"}
1. 增量更改: Developers find it easy to make some changes in this architecture, particularly if those changes are isolated to existing layers.
2. 指导变更: Developers find it easier to write fitness functions in a more structured version of a monolith because the structure of the architecture is more apparent.
3. 适当耦合: One of the virtues of monolith architectures is easy understandability. Layered architectures allow for easy evolution of the technical architecture partitions defined by the layers.
#### Modular monoliths
A modular monolith contains logical grouping of functionality with well-defined isolation between modules.
{:height="100%" width="100%"}
1. 增量更改: Incremental change is easy in this type of architecture because developers can enforce modularity.
2. 指导变更: Tests, metrics, and other fitness function mechanisms are easier to design and implement in this architecture because of good separation of components, allowing easier mocking and other testing techniques that rely on isolation layers.
3. 适当耦合: A well-designed modular monolith is a good example of appropriate coupling.
#### Microkernel
Consider another popular monolithic architectural style, the microkernal architecture, commonly found in browsers and integrated development environments (IDEs)
1. 增量更改: Once the core system is complete, most behavior should come from plug-ins, small units of deployment.
2. 指导变更: Fitness functions are typically easy to create in this architecture because of the isolation between the core and plug-ins.
3. 适当耦合: The coupling characteristics in this architecture are well defined by the microkernel pattern.
#### Event-Driven Architectures
Event-driven architectures (EDA) usually integrate several disparate systems together using message queues. There are two common implementations of this type of architecture: the broker and mediator patterns.
**Brokers**
In a broker EDA, the architectural components consist of the following elements:
* message queues : Message queues implemented via a wide variety of technologies such as JMS (Java Messaging Service).
* initiating event : The event that starts the business process.
* intra-process events : Events passed between event processors to fulfill a business process.
* event processors : The active architecture components, which perform actual business processing. When two processors need to coordinate, they pass messages via queues.
{:height="100%" width="100%"}
1. 增量更改: Broker EDAs allow incremental change in multiple forms. Building deployment pipelines for broker EDAs can be challenging because the essence of the architecture is asynchronous communication, which is notoriously difficult to test.
2. 指导变更: Atomic fitness functions should be easy for developers to write in this architecture because the individual behaviors of event processors is simple.
3. 适当耦合: Broker EDAs exhibit a low degree of coupling, enhancing the ability to make evolutionary change.
**Mediators**
The other common EDA pattern is the mediator, where an additional component appears: a hub that acts as a coordinator
{:height="100%" width="100%"}
1. 增量更改: Similar to broker EDAs, the services in a mediator EDA are typically small and self-contained. Thus, this architecture shares many of the operational advantages of the broker version.
2. 指导变更: Developers find it easier to build fitness functions for the mediator than for the broker EDA. The tests for individual event processors don’t differ much from the broker version. However, holistic fitness functions are easier to build because developers can rely on the mediator to handle coordination.
3. 适当耦合: While many testing scenarios become easier with mediators, coupling increases, harming evolution.
#### Service-Oriented Architectures
ESB-driven SOA A particular manner of creating SOAs became popular several years ago, building an architecture based around services and coordination via a service bus — typically called an Enterprise Service Bus (ESB).
1. 增量更改: While having a well-established technical service taxonomy allows for reuse and segregation of resources, it greatly hampers making the most common types of change to business domains.
2. 指导变更: Testing in general is difficult within ESB-driven SOA.
3. 适当耦合: From a potential enterprise reuse standpoint, extravagant taxonomy makes sense.
#### Microservices
* Microservices architectures partition across domain lines, embedding the technical architecture
1. 增量更改: Both aspects of incremental change are easy in microservices architectures.
2. 指导变更: Developers can easily build both atomic and holistic fitness functions for microservices architectures.
3. 适当耦合: Microservices architectures typically have two kinds of coupling: integration and service template.
#### Serverless
* “Serverless” Architectures: “Serverless” architectures are a recent shift in the software development equilibrium, with two broad meanings, both applicable to evolutionary architecture.
{:height="100%" width="100%"}
1. 增量更改: Incremental change in serverless architectures should consist of redeploying code — all the infrastructure concerns exist behind the abstraction of “serverless.”
2. 指导变更: Fitness functions are critical in this type of architecture to ensure integration points stay consistent.
3. 适当耦合: From an evolutionary architecture standpoint, FaaS is attractive because it eliminates several different dimensions from consideration: technical architecture, operational concerns, and security issues, among others.
## Chapter 5. Evolutionary Data
-------
When we refer to the DBA, we mean anyone who designs the data structures, writes code to access the data and use the data in an application, writes code that executes in the database, maintains and performance tunes the databases, and ensures proper backup and recovery procedures in the event of disaster. DBAs and developers are often the core builders of applications, and should coordinate closely.
Option 1: No integration points, no legacy data
```
ALTER TABLE customer ADD firstname VARCHAR2( 60);
ALTER TABLE customer ADD firstname VARCHAR2(60);
ALTER TABLE customer ADD lastname VARCHAR2(60);
ALTER TABLE customer DROP COLUMN name;
Option 2: Legacy data, but no integration points
```
ALTER TABLE Customer ADD firstname VARCHAR2( 60);
ALTER TABLE Customer ADD firstname VARCHAR2(60);
ALTER TABLE Customer ADD lastname VARCHAR2(60);
UPDATE Customer SET firstname = extractfirstname(name);
UPDATE Customer SET lastname = extractlastname(name);
```
Option 3: Existing data and integration points
```
ALTER TABLE Customer ADD firstname VARCHAR2( 60);
ALTER TABLE Customer ADD firstname VARCHAR2(60);
ALTER TABLE Customer ADD lastname VARCHAR2(60);
UPDATE Customer SET firstname = extractfirstname(name);
UPDATE Customer SET lastname = extractlastname(name);
CREATE OR REPLACE TRIGGER SynchronizeName
BEFORE INSERT OR UPDATE
ON Customer
REFERENCING OLD AS OLD NEW AS NEW
FOR EACH ROW
BEGIN
  IF :NEW.name IS NULL THEN
    :NEW.name := :NEW.firstname || ' ' || :NEW.lastname;
  END IF;
  IF :NEW.name IS NOT NULL THEN
    :NEW.firstname := extractfirstname(:NEW.name);
    :NEW.lastname := extractlastname(:NEW.name);
  END IF;
```
## Chapter 6. Building Evolvable Architectures
-------
### Mechanics
Architects can operationalize these techniques for building an evolutionary architecture in three steps:
1. Identify Dimensions Affected by Evolution:First, architects must identify which dimensions of the architecture they want to protect as it evolves.
2. Define Fitness Function(s) for Each Dimension:A single dimension often contains numerous fitness functions. Then, for each dimension, they decide what parts may exhibit undesirable behavior when evolving, eventually defining fitness functions.
3. Use Deployment Pipelines to Automate Fitness Functions:Lastly, architects must encourage incremental change on the project, defining stages in a deployment pipeline to apply fitness functions and managing deployment practices like machine provisioning, testing, and other DevOps concerns.
### Retrofitting Existing Architectures
Adding evolvability to existing architectures depends on three factors: component coupling, engineering practice maturity, and developer ease in crafting fitness functions.
For the first step in migrating architecture, developers identify new service boundaries. Teams may decide to break monoliths into services via a variety of partitioning as follows:
* Business functionality groups:A business may have clear partitions that mirror IT capabilities directly.
* Transactional boundaries:Many businesses have extensive transaction boundaries they must adhere to.
* Deployment goals:Incremental change allows developers to selectively release code on different schedules.
## Chapter 7. Evolutionary Architecture Pitfalls and Antipatterns
-------
* Antipattern: Vendor King:Rather than fall victim to the vendor king antipattern, treat vendor products as just another integration point. Developers can insulate vendor tool changes from impacting their architecture by building anticorruption layers between integration points.
* A typical software stack in 2016, with lots of moving parts
{:height="100%" width="100%"}
* Microservices eschew code reuse, adopting the philosophy of prefer duplication to coupling: reuse implies coupling, and microservices architectures are extremely decoupled.
* We’re not suggesting teams avoid building reusable assets, but rather evaluate them continually to ensure they still deliver value.
## Chapter 8. Putting Evolutionary Architecture into Practice
-------
**This includes both technical and business concerns, including organization and team impacts.**
### Organizational Factors
The impact of software architecture has a surprisingly wide breadth on a variety of factors not normally associated with software, including team impacts, budgeting, and a host of others.
**Cross-Functional Teams**
these roles must change to accommodate this new structure, which includes the following roles:
1. Business Analysts:Must coordinate the goals of this service with other services, including other service teams.
2. Architecture: Design architecture to eliminate inappropriate coupling that complicates incremental change.
3. Testing: Testers must become accustomed to the challenges of integration testing across domains,
4. Operations: Slicing up services and deploying them separately (often alongside existing services and deployed continuously) is a daunting challenge for many organizations with traditional IT structures.
5. Data: Database administrators must deal with new granularity, transaction, and system of record issues.
| 65.351171 | 402 | 0.801842 | eng_Latn | 0.99526 |
61e7471a81be6265450b060a04b4b50c2c7b0b27 | 825 | md | Markdown | _pages/cortex.md | shmcgavin/brainlife.github.io | 4a6dedecff21f6d4d1564c81e6f70322e6e0960c | [
"Apache-2.0"
] | 2 | 2019-02-25T18:39:47.000Z | 2020-06-23T17:52:00.000Z | _pages/cortex.md | shmcgavin/brainlife.github.io | 4a6dedecff21f6d4d1564c81e6f70322e6e0960c | [
"Apache-2.0"
] | null | null | null | _pages/cortex.md | shmcgavin/brainlife.github.io | 4a6dedecff21f6d4d1564c81e6f70322e6e0960c | [
"Apache-2.0"
] | 5 | 2019-03-24T15:31:53.000Z | 2021-01-16T00:32:37.000Z | ---
title: "Brain Tissue Extraction"
subtitle: "Process brain images using FreeSurfer"
layout: app
starturl: "/legacy/cortex"
permalink: /home/cortex
giturl: "https://github.com/brain-life/app-freesurfer"
---
Automated access to [Compute clouds](https://jetstream-cloud.org) to process a T1-weighted anatomical image using [Freesurfer](https://surfer.nmr.mgh.harvard.edu/) and segment into multiple tissue types and brain regions. This service allows uploading a T1-weighted image and generates the following derivaties:
* Subcortical regions segmentation
* Cortical surface reconstruction
* Cortical mantle parcellation
* Cortical thickness estimation
* Skull stripping
* White matter volume identification
<br>
<h3>Sample Output</h3>
<center>
<img src="/images/screenshots/cortex.png" class="screenshot">
</center>
<br>
| 33 | 311 | 0.78303 | eng_Latn | 0.832188 |
61e839a047cdccdc35b8ce9549009c205b77e194 | 682 | md | Markdown | README.md | alekspankov/docker-php-7.0-extended | 9aba467f136f2299de7725ca98d891a8aa4e6e56 | [
"MIT"
] | null | null | null | README.md | alekspankov/docker-php-7.0-extended | 9aba467f136f2299de7725ca98d891a8aa4e6e56 | [
"MIT"
] | null | null | null | README.md | alekspankov/docker-php-7.0-extended | 9aba467f136f2299de7725ca98d891a8aa4e6e56 | [
"MIT"
] | null | null | null | # Extended PHP 7.0 / Apache
This image is extended to meet Laravel requirements.
## Getting Started
1. Pull image ```docker pull aleksxp/php7.0-apache-extended```
1. Run ```docker run -d --name some-php -p 8080:80 -v $(pwd)/src:/var/www/html aleksxp/php7.0-apache-extended```
Read more about **docker run** option in [official Docker documentstion](https://docs.docker.com/engine/reference/run/).
### Additional software
- wget
- nano
### Prerequisites
1. [Docker](https://docs.docker.com/engine/installation/)
## Authors
* **Alexander Pankov** <[email protected]>
## License
This project is licensed under the MIT License - see the [LICENSE.md](LICENSE.md) file for details | 24.357143 | 120 | 0.715543 | eng_Latn | 0.673664 |
61e8b80d7ccfc4cd4bfa2dab0a61f88a76140049 | 8,926 | md | Markdown | articles/service-bus/service-bus-amqp-overview.md | wesmc7777/azure-content | 46ff014f473bf6b195d47911186cc238f79ff826 | [
"CC-BY-3.0"
] | null | null | null | articles/service-bus/service-bus-amqp-overview.md | wesmc7777/azure-content | 46ff014f473bf6b195d47911186cc238f79ff826 | [
"CC-BY-3.0"
] | null | null | null | articles/service-bus/service-bus-amqp-overview.md | wesmc7777/azure-content | 46ff014f473bf6b195d47911186cc238f79ff826 | [
"CC-BY-3.0"
] | null | null | null | <properties
pageTitle="Service Bus AMQP overview | Microsoft Azure"
description="Learn about using the Advanced Message Queuing Protocol (AMQP) 1.0 in Azure."
services="service-bus"
documentationCenter=".net"
authors="sethmanheim"
manager="timlt"
editor="mattshel"/>
<tags
ms.service="service-bus"
ms.workload="tbd"
ms.tgt_pltfrm="na"
ms.devlang="multiple"
ms.topic="article"
ms.date="10/05/2015"
ms.author="sethm"/>
# AMQP 1.0 support in Service Bus
Both the Azure Service Bus cloud service and on-prem [Service Bus for Windows Server (Service Bus 1.1)](https://msdn.microsoft.com/library/dn282144.aspx) support the Advanced Message Queueing Protocol (AMQP) 1.0. AMQP enables you to build cross-platform, hybrid applications using an open standard protocol. You can construct applications using components that are built using different languages and frameworks, and that run on different operating systems. All these components can connect to Service Bus and seamlessly exchange structured business messages efficiently and at full fidelity.
## Introduction: What is AMQP 1.0 and why is it important?
Traditionally, message-oriented middleware products have used proprietary protocols for communication between client applications and brokers. This means that once you've selected a particular vendor's messaging broker, you must use that vendor's libraries to connect your client applications to that broker. This results in a degree of dependence on that vendor, since porting an application to a different product requires code changes in all the connected applications.
Furthermore, connecting messaging brokers from different vendors is tricky. This typically requires application-level bridging to move messages from one system to another and to translate between their proprietary message formats. This is a common requirement; for example, when you must provide a new unified interface to older disparate systems, or integrate IT systems following a merger.
The software industry is a fast-moving business; new programming languages and application frameworks are introduced at a sometimes bewildering pace. Similarly, the requirements of IT systems evolve over time and developers want to take advantage of the latest platform features. However, sometimes the selected messaging vendor does not support these platforms. Because messaging protocols are proprietary, it's not possible for others to provide libraries for these new platforms. Therefore, you must use approaches such as building gateways or bridges that enable you to continue to use the messaging product.
The development of the Advanced Message Queuing Protocol (AMQP) 1.0 was motivated by these issues. It originated at JP Morgan Chase, who, like most financial services firms, are heavy users of message-oriented middleware. The goal was simple: to create an open-standard messaging protocol that made it possible to build message-based applications using components built using different languages, frameworks, and operating systems, all using best-of-breed components from a range of suppliers.
## AMQP 1.0 technical features
AMQP 1.0 is an efficient, reliable, wire-level messaging protocol that you can use to build robust, cross-platform, messaging applications. The protocol has a simple goal: to define the mechanics of the secure, reliable, and efficient transfer of messages between two parties. The messages themselves are encoded using a portable data representation that enables heterogeneous senders and receivers to exchange structured business messages at full fidelity. The following is a summary of the most important features:
* **Efficient**: AMQP 1.0 is a connection-oriented protocol that uses a binary encoding for the protocol instructions and the business messages transferred over it. It incorporates sophisticated flow-control schemes to maximize the utilization of the network and the connected components. That said, the protocol was designed to strike a balance between efficiency, flexibility and interoperability.
* **Reliable**: The AMQP 1.0 protocol allows messages to be exchanged with a range of reliability guarantees, from fire-and-forget to reliable, exactly-once acknowledged delivery.
* **Flexible**: AMQP 1.0 is a flexible protocol that can be used to support different topologies. The same protocol can be used for client-to-client, client-to-broker, and broker-to-broker communications.
* **Broker-model independent**: The AMQP 1.0 specification does not make any requirements on the messaging model used by a broker. This means that it's possible to easily add AMQP 1.0 support to existing messaging brokers.
## AMQP 1.0 is a Standard (with a capital 'S')
AMQP 1.0 has been in development since 2008 by a core group of more than 20 companies, both technology suppliers and end-user firms. During that time, user firms have contributed their real-world business requirements and the technology vendors have evolved the protocol to meet those requirements. Throughout the process, vendors have participated in workshops in which they collaborated to validate the interoperability between their implementations.
In October 2011, the development work transitioned to a technical committee within the Organization for the Advancement of Structured Information Standards (OASIS) and the OASIS AMQP 1.0 Standard was released in October 2012. The following firms participated in the technical committee during the development of the standard:
* **Technology vendors**: Axway Software, Huawei Technologies, IIT Software, INETCO Systems, Kaazing, Microsoft, Mitre Corporation, Primeton Technologies, Progress Software, Red Hat, SITA, Software AG, Solace Systems, VMware, WSO2, Zenika.
* **User firms**: Bank of America, Credit Suisse, Deutsche Boerse, Goldman Sachs, JPMorgan Chase.
Some of the commonly cited benefits of open standards include:
* Less chance of vendor lock-in
* Interoperability
* Broad availability of libraries and tooling
* Protection against obsolescence
* Availability of knowledgeable staff
* Lower and manageable risk
## AMQP 1.0 and Service Bus
AMQP 1.0 support in Azure Service Bus means that you can now leverage the Service Bus queuing and publish/subscribe brokered messaging features from a range of platforms using an efficient binary protocol. Furthermore, you can build applications comprised of components built using a mix of languages, frameworks, and operating systems.
The following figure illustrates an example deployment in which Java clients running on Linux, written using the standard Java Message Service (JMS) API and .NET clients running on Windows, exchange messages via Service Bus using AMQP 1.0.
![][0]
**Figure 1: Example deployment scenario showing cross-platform messaging using Service Bus and AMQP 1.0**
At this time the following client libraries are known to work with Service Bus:
| Language | Library |
|----------|-------------------------------------------------------------------------------|
| Java | Apache Qpid Java Message Service (JMS) client<br/>IIT Software SwiftMQ Java client |
| C | Apache Qpid Proton-C |
| PHP | Apache Qpid Proton-PHP |
| Python | Apache Qpid Proton-Python |
| C# | AMQP .Net Lite |
**Figure 2: Table of AMQP 1.0 client libraries**
## Summary
* AMQP 1.0 is an open, reliable messaging protocol that you can use to build cross-platform, hybrid applications. AMQP 1.0 is an OASIS standard.
* AMQP 1.0 support is now available in Azure Service Bus as well as Service Bus for Windows Server (Service Bus 1.1). Pricing is the same as for the existing protocols.
## Next steps
Ready to learn more? Visit the following links:
- [Using Service Bus from .NET with AMQP]
- [Using Service Bus from Java with AMQP]
- [Using Service Bus from Python with AMQP]
- [Using Service Bus from PHP with AMQP]
- [Installing Apache Qpid Proton-C on an Azure Linux VM]
- [AMQP in Service Bus for Windows Server]
[0]: ./media/service-bus-amqp-overview/service-bus-amqp-1.png
[Using Service Bus from .NET with AMQP]: service-bus-amqp-dotnet.md
[Using Service Bus from Java with AMQP]: service-bus-amqp-java.md
[Using Service Bus from Python with AMQP]: service-bus-amqp-python.md
[Using Service Bus from PHP with AMQP]: service-bus-amqp-php.md
[Installing Apache Qpid Proton-C on an Azure Linux VM]: service-bus-amqp-apache.md
[AMQP in Service Bus for Windows Server]: https://msdn.microsoft.com/library/dn574799.aspx | 84.207547 | 613 | 0.747367 | eng_Latn | 0.996261 |
61e8d00b04a3a98a42fa1b969c05af676a8d48d7 | 470 | md | Markdown | .github/ISSUE_TEMPLATE/question-help.md | HimashiRathnayake/adapter-transformers | d9c06ecbf4aaa33756e848b8fc5b3ec65f5ff4f4 | [
"Apache-2.0"
] | 723 | 2020-07-16T13:02:25.000Z | 2022-03-31T21:03:55.000Z | .github/ISSUE_TEMPLATE/question-help.md | HimashiRathnayake/adapter-transformers | d9c06ecbf4aaa33756e848b8fc5b3ec65f5ff4f4 | [
"Apache-2.0"
] | 170 | 2020-07-16T14:39:11.000Z | 2022-03-31T13:02:11.000Z | .github/ISSUE_TEMPLATE/question-help.md | HimashiRathnayake/adapter-transformers | d9c06ecbf4aaa33756e848b8fc5b3ec65f5ff4f4 | [
"Apache-2.0"
] | 131 | 2020-07-16T14:38:16.000Z | 2022-03-29T19:43:18.000Z | ---
name: "❓ Questions & Help"
about: Submit a question on working with adapter-transformers or a request for help
title: ''
labels: 'question'
assignees: ''
---
## Environment info
<!-- You can run the command `transformers-cli env` and copy-and-paste its output below.
Remove if your question/ request is not technical. -->
- `adapter-transformers` version:
- Platform:
- Python version:
- PyTorch version (GPU?):
## Details
<!-- Additional details -->
| 21.363636 | 88 | 0.689362 | eng_Latn | 0.980029 |
61e8df32d8a2ca67efb8ea463c1f166fb5cc8290 | 37 | md | Markdown | README.md | dorokei/dorobot | a9f108aa7591038cf64a1b49e7589d6e743a2f01 | [
"MIT"
] | 2 | 2015-12-05T00:59:35.000Z | 2018-07-02T15:50:12.000Z | README.md | dorokei/dorobot | a9f108aa7591038cf64a1b49e7589d6e743a2f01 | [
"MIT"
] | null | null | null | README.md | dorokei/dorobot | a9f108aa7591038cf64a1b49e7589d6e743a2f01 | [
"MIT"
] | null | null | null | # dorobot
Robot operated by browser
| 12.333333 | 26 | 0.783784 | eng_Latn | 0.992931 |
61ea0129286b253b3f281342c32ed50df36dfe00 | 2,744 | md | Markdown | content/drone-plugins/drone-buildah/index.md | jamie-harness/drone-plugin-index | 1ba83ec4941dbf926a619b46c7cd43b173a9add3 | [
"MIT"
] | null | null | null | content/drone-plugins/drone-buildah/index.md | jamie-harness/drone-plugin-index | 1ba83ec4941dbf926a619b46c7cd43b173a9add3 | [
"MIT"
] | null | null | null | content/drone-plugins/drone-buildah/index.md | jamie-harness/drone-plugin-index | 1ba83ec4941dbf926a619b46c7cd43b173a9add3 | [
"MIT"
] | null | null | null | ---
date: 2021-11-07T00:00:00+00:00
title: Buildah
author: drone-plugins
tags: [buildah, cache, amazon, aws, s3]
logo: buildah.svg
repo: drone-plugins/drone-buildah
image: plugins/buildah-docker
---
The buildah plugin allows you to build images without privileges. The below pipeline configuration demonstrates simple usage
:
```yaml
kind: pipeline
name: default
steps
: - name: publish
image: plugins/buildah
settings
: repo: docker.io/harness/ci-automation
registry: docker.io
password: <username>
username: <password>
dockerfile: Dockerfile
tags: buildahoutput
layers: true
s3_local_cache_dir: ./test
s3_bucket: <s3_bucket_name>
s3_region: us-east-1
s3_key: <s3_access_key>
s3_secret: <s3_secret>
s3_endpoint: s3.amazonaws.com
```
# Parameter Reference
All optionals, most pf the flags are similar to what docker have, also check https://github.com/containers/buildah/blob/main/README.md:
dry-run
: dry run disables docker push
remote.url
: git remote url
commit.sha
: git commit sha
commit.ref
: git commit ref
dockerfile:
dockerfile used to build, default "Dockerfile"
context
: build context
tags
: tag used to tag the built image, default "latest"
tags.auto
: default build tags
tags.suffix
: default build tags with suffix
args
: build args
args-from-env
: build args
quiet
: quiet docker build
target
: build target
squash
: squash the layers at build time
pull-image
: force pull base image at build time
compress
: compress the build context using gzip
repo
: docker repository used to push image
custom-labels
: additional k=v labels
label-schema
: label-schema labels
auto-label
: auto-label true|false
link
: link, for example https://example.com/org/repo-name
docker.registry
: docker registry used to push image, default "https://index.docker.io/v1/"
docker.username
: docker username
docker.password
: docker password
docker.email
: docker email
docker.config
: docker json dockerconfig content
docker.purge
: docker should cleanup images
repo.branch
: repository default branch
no-cache
: do not use cached intermediate containers
add-host
: additional host:IP mapping
layers
: Use layered caching
s3-local-cache-dir
: local directory for saving S3 based cache, only usable when layers is set to true
s3-bucket
: S3 bucket name, only usable when layers is set to true
s3-endpoint
: S3 endpoint address, only usable when layers is set to true
s3-region
: S3 region, only usable when layers is set to true
s3-key
: S3 access key, only usable when layers is set to true
s3-secret
: S3 access secret, only usable when layers is set to true
s3-use-ssl
: Enable SSL for S3 connections, only usable when layers is set to true
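As an illustration, a hypothetical step combining a few of these parameters might look like the following sketch. The setting names follow the same dash-to-underscore convention as the example above, and all values (including the build-arg format) are placeholders rather than tested configuration:
```yaml
kind: pipeline
name: default
steps:
- name: publish
  image: plugins/buildah
  settings:
    repo: docker.io/<org>/<image>
    registry: docker.io
    username: <username>
    password: <password>
    dockerfile: docker/Dockerfile
    target: release
    args: BUILD_ENV=prod
    tags: v1.0.0
```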
| 17.703226 | 136 | 0.750364 | eng_Latn | 0.944561 |
61eaaca2acf9faf1b4fbec05268ef1307ecb469e | 1,825 | md | Markdown | docs/framework/unmanaged-api/debugging/icordebugmodule-getfunctionfromtoken-method.md | adamsitnik/docs.pl-pl | c83da3ae45af087f6611635c348088ba35234d49 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/framework/unmanaged-api/debugging/icordebugmodule-getfunctionfromtoken-method.md | adamsitnik/docs.pl-pl | c83da3ae45af087f6611635c348088ba35234d49 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/framework/unmanaged-api/debugging/icordebugmodule-getfunctionfromtoken-method.md | adamsitnik/docs.pl-pl | c83da3ae45af087f6611635c348088ba35234d49 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: ICorDebugModule::GetFunctionFromToken Method
ms.date: 03/30/2017
api_name:
- ICorDebugModule.GetFunctionFromToken
api_location:
- mscordbi.dll
api_type:
- COM
f1_keywords:
- ICorDebugModule::GetFunctionFromToken
helpviewer_keywords:
- GetFunctionFromToken method, ICorDebugModule interface [.NET Framework debugging]
- ICorDebugModule::GetFunctionFromToken method [.NET Framework debugging]
ms.assetid: 6fe12194-4ef7-43c1-9570-ade35ccf127a
topic_type:
- apiref
ms.openlocfilehash: cb966a918c63b4fbc00dcf52819b9384427dfdaa
ms.sourcegitcommit: 559fcfbe4871636494870a8b716bf7325df34ac5
ms.translationtype: MT
ms.contentlocale: pl-PL
ms.lasthandoff: 10/30/2019
ms.locfileid: "73129586"
---
# <a name="icordebugmodulegetfunctionfromtoken-method"></a>ICorDebugModule::GetFunctionFromToken Method
Gets the function that is specified by the metadata token.
## <a name="syntax"></a>Syntax
```cpp
HRESULT GetFunctionFromToken(
[in] mdMethodDef methodDef,
[out] ICorDebugFunction **ppFunction
);
```
## <a name="parameters"></a>Parameters
`methodDef`
[in] An `mdMethodDef` metadata token that references the function's metadata.
`ppFunction`
[out] A pointer to the address of an ICorDebugFunction interface object that represents the function.
## <a name="remarks"></a>Remarks
The `GetFunctionFromToken` method returns an HRESULT value of CORDBG_E_FUNCTION_NOT_IL if the value passed in `methodDef` does not refer to an MSIL method.
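For illustration, here is a minimal hypothetical sketch of a call site (this snippet is not part of the original reference; `pModule` is assumed to be a valid `ICorDebugModule` pointer and `methodDef` a valid token):
```cpp
// Hypothetical sketch: resolving a methodDef token to an ICorDebugFunction.
ICorDebugFunction *pFunction = nullptr;
HRESULT hr = pModule->GetFunctionFromToken(methodDef, &pFunction);
if (hr == CORDBG_E_FUNCTION_NOT_IL)
{
    // The token does not refer to an MSIL method.
}
else if (SUCCEEDED(hr))
{
    // ... use pFunction ...
    pFunction->Release(); // release the COM reference when done
}
```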
## <a name="requirements"></a>Requirements
**Platforms:** See [System Requirements](../../../../docs/framework/get-started/system-requirements.md).
**Header:** CorDebug.idl, CorDebug.h
**Library:** CorGuids.lib
**.NET Framework Versions:** [!INCLUDE[net_current_v10plus](../../../../includes/net-current-v10plus-md.md)]
| 33.181818 | 151 | 0.75726 | yue_Hant | 0.404884 |
61eab4bb7faa19e788eab90c1e72bc18c7769c43 | 5,411 | md | Markdown | articles/cosmos-db/cassandra-introduction.md | peder-andfrankly/azure-docs.sv-se | 49435a06686fc72ca9cd8c83883c3c3704a6ec72 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/cosmos-db/cassandra-introduction.md | peder-andfrankly/azure-docs.sv-se | 49435a06686fc72ca9cd8c83883c3c3704a6ec72 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/cosmos-db/cassandra-introduction.md | peder-andfrankly/azure-docs.sv-se | 49435a06686fc72ca9cd8c83883c3c3704a6ec72 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Introduction to the Azure Cosmos DB Cassandra API
description: Learn how you can use Azure Cosmos DB to lift and shift existing applications and build new applications with the Cassandra API, using the Cassandra drivers and CQL you are already familiar with.
author: kanshiG
ms.service: cosmos-db
ms.subservice: cosmosdb-cassandra
ms.topic: overview
ms.date: 05/21/2019
ms.author: govindk
ms.reviewer: sngun
ms.openlocfilehash: 82ca7814f756a12005ee5802c3e8a7fd28f6d398
ms.sourcegitcommit: e9a46b4d22113655181a3e219d16397367e8492d
ms.translationtype: MT
ms.contentlocale: sv-SE
ms.lasthandoff: 05/21/2019
ms.locfileid: "65968968"
---
# <a name="introduction-to-the-azure-cosmos-db-cassandra-api"></a>Introduction to the Azure Cosmos DB Cassandra API
The Azure Cosmos DB Cassandra API can be used as the data store for apps written for [Apache Cassandra](https://cassandra.apache.org/). This means that, by using existing [Apache drivers](https://cassandra.apache.org/doc/latest/getting_started/drivers.html?highlight=driver) compliant with CQLv4, your existing Cassandra application can now communicate with the Azure Cosmos DB Cassandra API. In many cases, you can switch from using Apache Cassandra to using the Azure Cosmos DB Cassandra API by just changing a connection string.
The Cassandra API lets you interact with data stored in Azure Cosmos DB using the Cassandra Query Language (CQL), Cassandra-based tools (such as cqlsh), and the Cassandra client drivers that you are already familiar with.
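As an illustration only (this snippet is not part of the original article), connecting with the Python `cassandra-driver` might look like the following sketch. The account name and key are placeholders, the `*.cassandra.cosmos.azure.com:10350` endpoint convention and the SSL setup are assumptions, and a recent driver version is assumed:
```python
# Hypothetical connection sketch for a Cassandra API account.
from ssl import PROTOCOL_TLSv1_2, SSLContext
from cassandra.auth import PlainTextAuthProvider
from cassandra.cluster import Cluster
ssl_context = SSLContext(PROTOCOL_TLSv1_2)
auth_provider = PlainTextAuthProvider(username="<account-name>",
                                      password="<account-key>")
cluster = Cluster(["<account-name>.cassandra.cosmos.azure.com"],
                  port=10350, auth_provider=auth_provider,
                  ssl_context=ssl_context)
session = cluster.connect()
print(session.execute("SELECT release_version FROM system.local").one())
```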
## <a name="what-is-the-benefit-of-using-apache-cassandra-api-for-azure-cosmos-db"></a>What is the benefit of using the Apache Cassandra API for Azure Cosmos DB?
**No operations management**: As a fully managed cloud service, the Azure Cosmos DB Cassandra API removes the overhead of managing and monitoring a myriad of settings across operating systems, JVMs, and yaml files, as well as their interactions. Azure Cosmos DB provides monitoring of throughput, latency, storage, availability, and configurable alerts.
**Performance management**: Azure Cosmos DB provides reads and writes with guaranteed low latency at the 99th percentile, backed by the SLAs. Users do not have to worry about operational overhead to ensure high performance and fast reads and writes. This means that users do not need to deal with scheduling compaction, manage tombstones, or set up bloom filters and replicas manually. Azure Cosmos DB takes these problems off your hands so you can focus on your application logic.
**Ability to use existing code and tools**: Azure Cosmos DB provides wire-protocol-level compatibility with existing Cassandra SDKs and tools. This compatibility ensures that you can use your existing codebase with the Azure Cosmos DB Cassandra API with minimal changes.
**Throughput and storage elasticity**: Azure Cosmos DB provides guaranteed throughput in all regions and can scale the provisioned throughput through the Azure portal, PowerShell, or the CLI. You can elastically scale storage and throughput for your tables as needed, with predictable performance.
**Global distribution and availability**: Azure Cosmos DB lets you distribute data globally across all Azure regions and serve data locally, with fast data access and high availability. Azure Cosmos DB provides 99.99% high availability within a region and 99.999% read and write availability across multiple regions, with no extra operational overhead. Learn more in the [Distribute data globally](distribute-data-globally.md) article.
**Choice of consistency levels**: Azure Cosmos DB lets you choose between five well-defined consistency levels for the best possible balance between consistency and performance. These consistency levels are strong, bounded staleness, session, consistent prefix, and eventual. These well-defined, practical, and intuitive consistency levels help developers make precise trade-offs between consistency, availability, and latency. Learn more in the [consistency levels](consistency-levels.md) article.
**Enterprise grade**: Azure Cosmos DB provides [compliance certifications](https://www.microsoft.com/trustcenter) so that users can use the platform securely. Azure Cosmos DB also provides encryption at rest and in motion, an IP firewall, and audit logs for control plane activities.
## <a name="next-steps"></a>Next steps
* You can quickly get started building the following language-specific apps to create and manage Cassandra API data:
  - [Node.js app](create-cassandra-nodejs.md)
  - [.NET app](create-cassandra-dotnet.md)
  - [Python app](create-cassandra-python.md)
* Get started with [creating a Cassandra API account, database, and table](create-cassandra-api-account-java.md) using a Java application.
* [Load sample data into the Cassandra API table](cassandra-api-load-data.md) using a Java application.
* [Query data from the Cassandra API account](cassandra-api-query-data.md) using a Java application.
* For more information on the Apache Cassandra features supported by the Azure Cosmos DB Cassandra API, see the [Cassandra support](cassandra-support.md) article.
* Read our [frequently asked questions](faq.md#cassandra).
| 96.625 | 571 | 0.813158 | swe_Latn | 0.999732 |
61eb50e6a7e5be81f8121051acc6d582d1c4aa88 | 1,332 | md | Markdown | _containers/brucemoran-Singularity-gridss-purple-linx.1.2.0.docker.md | singularityhub/singularityhub-archive | dd1d471db0c4ac01998cb84bd1ab82e97c8dab65 | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | _containers/brucemoran-Singularity-gridss-purple-linx.1.2.0.docker.md | singularityhub/singularityhub-archive | dd1d471db0c4ac01998cb84bd1ab82e97c8dab65 | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | _containers/brucemoran-Singularity-gridss-purple-linx.1.2.0.docker.md | singularityhub/singularityhub-archive | dd1d471db0c4ac01998cb84bd1ab82e97c8dab65 | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | ---
id: 13400
name: "brucemoran/Singularity"
branch: "master"
tag: "gridss-purple-linx.1.2.0.docker"
commit: "d7eacac0e447973156f0e31296ebd99c11a2132a"
version: "db0f1b5459dc516c4a4a913b1511d375"
build_date: "2021-02-23T14:18:14.751Z"
size_mb: 2360.0
size: 941240351
sif: "https://datasets.datalad.org/shub/brucemoran/Singularity/gridss-purple-linx.1.2.0.docker/2021-02-23-d7eacac0-db0f1b54/db0f1b5459dc516c4a4a913b1511d375.sif"
datalad_url: https://datasets.datalad.org?dir=/shub/brucemoran/Singularity/gridss-purple-linx.1.2.0.docker/2021-02-23-d7eacac0-db0f1b54/
recipe: https://datasets.datalad.org/shub/brucemoran/Singularity/gridss-purple-linx.1.2.0.docker/2021-02-23-d7eacac0-db0f1b54/Singularity
collection: brucemoran/Singularity
---
# brucemoran/Singularity:gridss-purple-linx.1.2.0.docker
```bash
$ singularity pull shub://brucemoran/Singularity:gridss-purple-linx.1.2.0.docker
```
## Singularity Recipe
```singularity
# Recipe not available: the archived copy could not be retrieved (the source returned "404 Not Found").
```
## Collection
- Name: [brucemoran/Singularity](https://github.com/brucemoran/Singularity)
- License: None
| 31.714286 | 161 | 0.765766 | yue_Hant | 0.231974 |
61eb6637bc558b723789b6fa0574e78767db1c93 | 3,068 | md | Markdown | README.md | TarikFerdaoussi/gatsby-all-in | d1b706c82ecaf76cd902471c57a6668ccc8eb013 | [
"MIT"
] | null | null | null | README.md | TarikFerdaoussi/gatsby-all-in | d1b706c82ecaf76cd902471c57a6668ccc8eb013 | [
"MIT"
] | null | null | null | README.md | TarikFerdaoussi/gatsby-all-in | d1b706c82ecaf76cd902471c57a6668ccc8eb013 | [
"MIT"
] | null | null | null | <div align="center">
<img width="200" height="200"
src="https://raw.githubusercontent.com/Gherciu/gatsby-all-in/master/static/logo.png">
<h1>gatsby-all-in</h1>
<p> 🗃⚛️A GatsbyJs starter that includes the most popular js libraries, already pre-configured and ready for use. <a href="https://gatsby-all-in.netlify.com/" alt="gatsby-all-in">DEMO</a>.</p>
</div>
[](https://app.netlify.com/sites/gatsby-all-in/deploys)
[](https://github.com/Gherciu/gatsby-all-in/blob/master/LICENSE.md)
[](https://github.com/Gherciu/gatsby-all-in)
### Getting started
- Create a new Gatsby site using the gatsby-all-in starter: `gatsby new blog https://github.com/Gherciu/gatsby-all-in`
- Edit configuration variables in the `.env.development` file (a hypothetical example is sketched below)
- Start dev server: `npm run start`
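The exact variables depend on what your own code reads; purely as an illustration, such a file might contain:
```
# Hypothetical example values - replace with the variables your project actually reads
GATSBY_API_URL=http://localhost:4000
GATSBY_GOOGLE_ANALYTICS_ID=UA-XXXXXXXXX-X
```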
### Features
- `ESLint` and `Stylelint` to enforce code style. Run `npm run lint:scripts` for `.js|.jsx` and `npm run lint:styles` for `.css|.scss` files.
- Pre-commit hooks with `husky` and `lint-staged`
- Useful SCSS helpers `_mixins` and `_vars` see all in `./src/styles`
- `redux` and `redux-devtools` implemented and configured to work well in `development` mode and `production`. The store is hot reloadable ;)
- Aliases for all folders (components, styles, store, etc.); see all available aliases in `./gatsby-config.js`
- `antd` is added and configured to work well as a UI framework (CSS normalization is not needed; antd has its own)
- All folders in `./src` have their own README.md file with a little documentation and usage guide
- `Helmet` implemented and configured with `gatsby-plugin-react-helmet`; see an example in `./src/layouts/MainLayout.js`
- Configured `tailwindcss` a utility-first CSS framework for rapidly building custom designs.
### When ready to build for production
- Create a `.env.production` file; the content should be the same as in `.env.development`
- Build the project: `npm run build`
- Start production server: `npm run serve`
---
## Contributing
1. Fork it!
2. Create your feature branch: `git checkout -b my-new-feature`
3. Commit your changes: `git commit -am 'Add some feature'`
4. Push to the branch: `git push origin my-new-feature`
5. Submit a pull request :D
## Author
**[@Gherciu/gatsby-all-in](https://github.com/Gherciu/gatsby-all-in)** © [GHERCIU](https://github.com/Gherciu), Released under the [MIT](https://github.com/Gherciu/gatsby-all-in/blob/master/LICENSE.md) License.<br>
Authored and maintained by GHERCIU with help from contributors ([list](https://github.com/Gherciu/gatsby-all-in/contributors)).
#### If you like this repository star⭐ and watch👀 on [GitHub](https://github.com/Gherciu/gatsby-all-in)
## 💫 Deploy
[](https://app.netlify.com/start/deploy?repository=https://github.com/Gherciu/gatsby-all-in)
| 54.785714 | 214 | 0.736962 | eng_Latn | 0.718112 |
61ebb38ed45e94963cdaaa2f3f2537e1aad4e5dc | 440 | md | Markdown | applications/bfp/README.md | ActorForth/bch-js-example | 2b99b21210acedd607b14567d2f97ee6d3382ca3 | [
"BSD-3-Clause"
] | 1 | 2020-10-04T06:53:59.000Z | 2020-10-04T06:53:59.000Z | applications/bfp/README.md | ActorForth/bch-js-example | 2b99b21210acedd607b14567d2f97ee6d3382ca3 | [
"BSD-3-Clause"
] | null | null | null | applications/bfp/README.md | ActorForth/bch-js-example | 2b99b21210acedd607b14567d2f97ee6d3382ca3 | [
"BSD-3-Clause"
] | null | null | null | # Bitcoin Files Protocol (BFP)
This folder contains examples for using the [Bitcoin Files Protocol](https://github.com/simpleledger/slp-specifications/blob/master/bitcoinfiles.md) to write data to the BCH blockchain. This allows files of up to 10KB to be written on-chain. It makes use of the [bitcoinfiles-node](https://www.npmjs.com/package/bitcoinfiles-node) npm package that was created by James Cramer and then modified by Vin Armani.
| 146.666667 | 408 | 0.8 | eng_Latn | 0.986123 |
61ec7ec905840147ff95cead3699520df52a4eb4 | 24 | md | Markdown | README.md | hardog/BeingMarry | 5fe6f8447afa7cc2a441d523b8f9154e225ef870 | [
"MIT"
] | null | null | null | README.md | hardog/BeingMarry | 5fe6f8447afa7cc2a441d523b8f9154e225ef870 | [
"MIT"
] | null | null | null | README.md | hardog/BeingMarry | 5fe6f8447afa7cc2a441d523b8f9154e225ef870 | [
"MIT"
] | null | null | null | # THEME 70
For My Love! | 8 | 12 | 0.666667 | kor_Hang | 0.415413 |
61ec8a17d6e4b46c65fc48fa580404341fbaf01c | 655 | md | Markdown | README.md | bobabots253/Code2022 | 8fe60010dd27c94d4da8dc3a193aa92bcd1f581e | [
"BSD-3-Clause"
] | null | null | null | README.md | bobabots253/Code2022 | 8fe60010dd27c94d4da8dc3a193aa92bcd1f581e | [
"BSD-3-Clause"
] | null | null | null | README.md | bobabots253/Code2022 | 8fe60010dd27c94d4da8dc3a193aa92bcd1f581e | [
"BSD-3-Clause"
] | 1 | 2022-03-26T19:13:54.000Z | 2022-03-26T19:13:54.000Z | <a href="http://millsroboticsteam253.com/"><img src="https://img.shields.io/badge/BobaBots-253-blue"></img></a>
[](https://travis-ci.org/MillsRoboticsTeam253/Code2020)
# Mills Boba Bots 253 2022 Code
- [x] 2022 WPILib Libraries
# Branches
- master: written for use in competition
- subsystem_testing: written to easily test that each individual subsystem is online and functioning as expected on a base level
- sfr: adapted from the master branch for modification during San Francisco Regional
## About
This robot code is written for the ***Rapid React*** game. <br>
| 50.384615 | 141 | 0.766412 | eng_Latn | 0.769352 |
61ecd3287847bb7b6dca3d27873d5d6c61c88c3b | 2,062 | md | Markdown | translations/pt-BR/content/packages/learn-github-packages/installing-a-package.md | Varshans2/docs | b9dc0417694f59b81f2f597bf72bdf116d05f88a | [
"CC-BY-4.0",
"MIT"
] | 4 | 2021-10-30T03:55:11.000Z | 2021-11-01T16:58:36.000Z | translations/pt-BR/content/packages/learn-github-packages/installing-a-package.md | Varshans2/docs | b9dc0417694f59b81f2f597bf72bdf116d05f88a | [
"CC-BY-4.0",
"MIT"
] | 144 | 2021-10-21T04:41:09.000Z | 2022-03-30T09:55:16.000Z | translations/pt-BR/content/packages/learn-github-packages/installing-a-package.md | Varshans2/docs | b9dc0417694f59b81f2f597bf72bdf116d05f88a | [
"CC-BY-4.0",
"MIT"
] | 3 | 2022-03-08T03:54:20.000Z | 2022-03-15T06:28:15.000Z | ---
title: Installing a package
intro: 'You can install a package from {% data variables.product.prodname_registry %} and use it as a dependency in your own project.'
product: '{% data reusables.gated-features.packages %}'
redirect_from:
- /github/managing-packages-with-github-packages/installing-a-package
- /packages/publishing-and-managing-packages/installing-a-package
- /packages/manage-packages/installing-a-package
permissions: You can install any package that you have permission to view.
versions:
fpt: '*'
ghes: '*'
ghae: '*'
---
{% data reusables.package_registry.packages-ghes-release-stage %}
{% data reusables.package_registry.packages-ghae-release-stage %}
## About package installation
You can search {% data variables.product.product_name %} to find packages hosted in {% data variables.product.prodname_registry %} that you can install in your own project. For more information, see "[Searching {% data variables.product.prodname_registry %} for packages](/search-github/searching-on-github/searching-for-packages)."
After you find a package, you can read the package's description and its installation and usage instructions on the package page.
## Installing a package
You can install a package from {% data variables.product.prodname_registry %} using any {% ifversion fpt or ghae %}supported package client{% else %}package type enabled for your instance{% endif %}, following the same general guidelines.
1. Authenticate to {% data variables.product.prodname_registry %} using the instructions for your package client. For more information, see "[Authenticating to GitHub Packages](/packages/learn-github-packages/introduction-to-github-packages#authenticating-to-github-packages)."
2. Install the package using the instructions for your package client (a hypothetical npm illustration follows below).
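For example, with npm as the package client, those two steps might look like the following sketch. The scope, package name, and token are placeholders, and the client-specific guide linked below is the authoritative reference:
```shell
# 1. Authenticate: store a personal access token (with read:packages) in ~/.npmrc
echo "//npm.pkg.github.com/:_authToken=<YOUR_TOKEN>" >> ~/.npmrc
# 2. Point the scope at the GitHub Packages npm registry and install
echo "@<owner>:registry=https://npm.pkg.github.com" >> .npmrc
npm install @<owner>/<package-name>
```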
For instructions specific to your package client, see "[Working with a {% data variables.product.prodname_registry %} registry](/packages/working-with-a-github-packages-registry)."
| 62.484848 | 351 | 0.783705 | por_Latn | 0.971558 |
61ed2f9e8df515fbee6db1f5e35da7bc6dd84231 | 969 | md | Markdown | _posts/artists/2015-8-22-pipblom.md | bignotyet/bignotyet.github.io | 4236627d6542cd4682a55cb4449896c665feafad | [
"CC-BY-3.0"
] | null | null | null | _posts/artists/2015-8-22-pipblom.md | bignotyet/bignotyet.github.io | 4236627d6542cd4682a55cb4449896c665feafad | [
"CC-BY-3.0"
] | null | null | null | _posts/artists/2015-8-22-pipblom.md | bignotyet/bignotyet.github.io | 4236627d6542cd4682a55cb4449896c665feafad | [
"CC-BY-3.0"
] | null | null | null | ---
layout: artist
title: Pip Blom
description: Grungy truth
description2: Dutch
categories: artists
spotify_url: https://open.spotify.com/artist/6zWJfH1TTmIqEi7EV35HGr?si=0md2arvDQ-K6HQ1TnsQcPA
facebook_url: https://www.facebook.com/pipblommusic/
instagram_url: https://www.instagram.com/pipblom/
twitter_url: https://twitter.com/pipblom/
youtubeId1: yWF3TsUYx6Y
youtubeId2: 8gp4-W1Ok90
image: assets/images/pipblom.jpg
---
**Pip** started writing on her own until she found herself in need of a band that could play her songs live. Her brother (and two other musicians) joined in and Pip Blom became a music project. Their frontwoman may look shy but she is actually confident about the kind of music she wants to play and how to play it - she still writes every part and sends it to her bandmates so that they can readjust. With no intention to release an album yet, the lo-fi guitars and soft vocals in their EP *Paycheck* (2018) will keep you pleased for now. | 51 | 540 | 0.790506 | eng_Latn | 0.994255 |
61ed479d1b562bd6bc3a6851ef7a557f60c983f5 | 2,234 | md | Markdown | test-results/sqlite-rdb2rdf/R2RMLTC0015a.md | bennokr/rdflib-r2r | ac6f3d31cff7c16210f62c65c71d858ba711d2ad | [
"MIT"
] | null | null | null | test-results/sqlite-rdb2rdf/R2RMLTC0015a.md | bennokr/rdflib-r2r | ac6f3d31cff7c16210f62c65c71d858ba711d2ad | [
"MIT"
] | null | null | null | test-results/sqlite-rdb2rdf/R2RMLTC0015a.md | bennokr/rdflib-r2r | ac6f3d31cff7c16210f62c65c71d858ba711d2ad | [
"MIT"
] | null | null | null | # R2RMLTC0015a
[link](https://www.w3.org/TR/rdb2rdf-test-cases/#R2RMLTC0015a)
Generation of language tags from a table with language information
## Created SQL query
```sql
SELECT anon_1.s AS s,
anon_1.p AS p,
anon_1.o AS o
FROM
(SELECT CAST('<' AS VARCHAR) || CAST('http://example.com/' AS VARCHAR) || replace(replace(replace(replace(replace(replace(CAST("View_NB2HI4B2F4XWK6DBNVYGYZJOMNXW2L3CMFZWKL2UOJUXA3DFONGWC4BS"."Code" AS VARCHAR), ' ', '%20'), '/', '%2F'), '(', '%28'), ')', '%29'), ',', '%2C'), ':', '%3A') || CAST('>' AS VARCHAR) AS s,
'<http://www.w3.org/2000/01/rdf-schema#label>' AS p,
CAST('"' AS VARCHAR) || CAST(CAST("View_NB2HI4B2F4XWK6DBNVYGYZJOMNXW2L3CMFZWKL2UOJUXA3DFONGWC4BS"."Name" AS VARCHAR) AS VARCHAR) || CAST('"@es' AS VARCHAR) AS o,
NULL AS g
FROM
(SELECT "Code",
"Name",
"Lan"
FROM "Country"
WHERE "Lan" = 'ES') AS "View_NB2HI4B2F4XWK6DBNVYGYZJOMNXW2L3CMFZWKL2UOJUXA3DFONGWC4BS"
UNION ALL SELECT CAST('<' AS VARCHAR) || CAST('http://example.com/' AS VARCHAR) || replace(replace(replace(replace(replace(replace(CAST("View_NB2HI4B2F4XWK6DBNVYGYZJOMNXW2L3CMFZWKL2UOJUXA3DFONGWC4BR"."Code" AS VARCHAR), ' ', '%20'), '/', '%2F'), '(', '%28'), ')', '%29'), ',', '%2C'), ':', '%3A') || CAST('>' AS VARCHAR) AS s,
'<http://www.w3.org/2000/01/rdf-schema#label>' AS p,
CAST('"' AS VARCHAR) || CAST(CAST("View_NB2HI4B2F4XWK6DBNVYGYZJOMNXW2L3CMFZWKL2UOJUXA3DFONGWC4BR"."Name" AS VARCHAR) AS VARCHAR) || CAST('"@en' AS VARCHAR) AS o,
NULL AS g
FROM
(SELECT "Code",
"Name",
"Lan"
FROM "Country"
WHERE "Lan" = 'EN') AS "View_NB2HI4B2F4XWK6DBNVYGYZJOMNXW2L3CMFZWKL2UOJUXA3DFONGWC4BR") AS anon_1
```
## Triple Diff
```diff
<http://example.com/BO> <http://www.w3.org/2000/01/rdf-schema#label> "Bolivia, Plurinational State of"@en .
<http://example.com/BO> <http://www.w3.org/2000/01/rdf-schema#label> "Estado Plurinacional de Bolivia"@es .
<http://example.com/IE> <http://www.w3.org/2000/01/rdf-schema#label> "Ireland"@en .
<http://example.com/IE> <http://www.w3.org/2000/01/rdf-schema#label> "Irlanda"@es .
```
SUCCESS | 54.487805 | 329 | 0.628917 | yue_Hant | 0.983259
61ed96b8dc45dd0c12504ecfdfece8a709a5a61d | 5,490 | md | Markdown | _pages/splash-page.md | gauravdiwan89/gauravdiwan89.github.io | 5f3bbbc4564c91b9483bcc074ef453717da2974e | [
"MIT"
] | null | null | null | _pages/splash-page.md | gauravdiwan89/gauravdiwan89.github.io | 5f3bbbc4564c91b9483bcc074ef453717da2974e | [
"MIT"
] | null | null | null | _pages/splash-page.md | gauravdiwan89/gauravdiwan89.github.io | 5f3bbbc4564c91b9483bcc074ef453717da2974e | [
"MIT"
] | null | null | null | ---
title: "Splash Page"
layout: splash
title: "Welcome"
author_profile: true
permalink: /
intro:
- excerpt: '<br/><br/>Welcome to my website! I am a Computational Biologist working at the intersection of Protein Structure, Evolutionary Biology and Phylogenetics. This website contains my major research directions and achievements. Cheers!'
#feature_row:
# - image_path: assets/images/unsplash-gallery-image-1-th.jpg
# alt: "placeholder image 1"
# title: "Placeholder 1"
# excerpt: "This is some sample content that goes here with **Markdown** formatting."
# - image_path: /assets/images/unsplash-gallery-image-2-th.jpg
# image_caption: "Image courtesy of [Unsplash](https://unsplash.com/)"
# alt: "placeholder image 2"
# title: "Placeholder 2"
# excerpt: "This is some sample content that goes here with **Markdown** formatting."
# url: "#test-link"
# btn_label: "Read More"
# btn_class: "btn--primary"
# - image_path: /assets/images/unsplash-gallery-image-3-th.jpg
# title: "Placeholder 3"
# excerpt: "This is some sample content that goes here with **Markdown** formatting."
feature_row2:
- image_path: ../images/og_pa_tree_big.png
alt: "phylogeny"
title: "Evolutionary history of biological functions"
excerpt: 'Throughout my research career, I have been interested in tracing the evolutionary history of major biological functions. This is also one of the major central questions of Evolutionary Biology. Given the burst of genomic and proteomic information that is available in public repositories such as NCBI and UniProt, we are in an opportune era where one can trace the detailed evolutionary histories of major biological functions by using principles of Orthology, Phylogenetics and Molecular Evolution. For instance, in my PhD work, I traced the [evolutionary history of wobble base pairing in bacteria](https://academic.oup.com/mbe/article/35/8/2046/5017355) and discovered the dynamic evolution of the major enzymes involved in this process. I also found several other genomic correlates that reflect the evolutionary history of these proteins. In my current work, I am now focusing on tracing the evolutionary history of many major functions across the entire tree of life.'
#url: ""
#btn_label: "Read More"
#btn_class: "btn--primary"
feature_row3:
- image_path: ../images/structure.png
alt: "structure"
title: "Impact of mutations on protein function"
    excerpt: 'Proteins are the most fundamental molecule in biology as any perturbations to these molecules lead to the improper functioning of the biological entity. A fundamental quest in biology is to elucidate the impact of different kinds of changes in the amino acid sequence of proteins on the three-dimensional structure and thereby function of the protein. A plethora of methods exist that rely on evolutionary conservation of protein sequences to try and answer this fundamental question. However, very few methods make use of structural information to understand and interpret the impact of mutations/variants. This field of variant interpretation may also be boosted by the recent availability of highly accurate computational models courtesy of [AlphaFold](https://alphafold.ebi.ac.uk/). However, [we noted and reported that the impact of these models may be lesser than anticipated](https://www.sciencedirect.com/science/article/pii/S0022283621004137). Thus, falling short of solving the three-dimensional structures of thousands of proteins and their respective variants, my aim is to develop a framework that uses the already available structural data (experimental and modelled) to infer how mutations may impact protein structure and function. A useful outcome of this exercise would be the ability to build genotype-phenotype maps of genetic disorders, as exemplified by our efforts in [predicting the 3D structure of the NBAS protein](https://www.nature.com/articles/s41436-019-0698-4) involved in Infantile Liver Failure Syndrome - 2.'
#url: ""
#btn_label: "Read More"
#btn_class: "btn--primary"
feature_row4:
- image_path: ../images/scatter.png
alt: "scatterplot"
title: "Biological Data Analysis"
excerpt: 'The amount of data available today is tremendous for almost every aspect of human life. In the same vein, the massive amount of data available through biological observations and experiments requires further analysis to gain insights. While most sources that generate these data are focused on analysing the data for their own purposes, my goal is to compare and analyze such data from different sources to derive a more general understanding of a biological phenomenon. For instance, I am currently analysing large scale proteomics datasets from a variety of organisms to come up with a metric that measures the difficulty proteins face while folding during the conditions of the experiment. These measures then would give an idea about whether certain proteins are particularly notorious while being handled in the laboratory, and if there is a way in which these could be handled more easily.'
#url: ""
#btn_label: "Read More"
#btn_class: "btn--primary"
---
{% include carousel.html height="35" unit="%" duration="6" %}
{% include feature_row id="intro" type="center" %}
{% include feature_row %}
{% include feature_row id="feature_row2" type="left" %}
{% include feature_row id="feature_row3" type="right" %}
{% include feature_row id="feature_row4" type="left" %}
| 87.142857 | 1,553 | 0.773588 | eng_Latn | 0.994525 |
61edaa64496dc957bf40db66ef59d382cddea2a6 | 3,097 | md | Markdown | README.md | Ornidon/MonoPollY | b513bb6fcfcc4189f85bdd96ae753372521d3449 | [
"CC-BY-3.0"
] | null | null | null | README.md | Ornidon/MonoPollY | b513bb6fcfcc4189f85bdd96ae753372521d3449 | [
"CC-BY-3.0"
] | null | null | null | README.md | Ornidon/MonoPollY | b513bb6fcfcc4189f85bdd96ae753372521d3449 | [
"CC-BY-3.0"
] | null | null | null | # MonoPollY - Poll Management
## Introduction
The goal of this application is to let different users create and take part in various questionnaires, and to view different statistics about them.
Here is the list of the main requested features:
- As a pollster, I can ask questions to an audience.
- As an audience member, I can answer the questions.
- As a participant, I can see the audience's answers.
- As a guest, I can create an account and register.
- As a registered user, I can log in or log out.
- As an audience member, I can join a room (audience).
- As a pollster, I can create a room.
- As a pollster, I can create questions in my room.
Here is the expected navigation flow for this application:

The following pages are accessible when the user is not authenticated:
- Index
- Login
- Register
The following pages are only accessible when the user is authenticated:
- Polls
A user is defined by:
- A username
- A password
## Deploying the application
### Prerequisites
To build and run the client project you need:
- node 4.6.1
- npm 2.15.9
To build and run the server project you need:
- java 1.8
- maven 3.3.9
### Client deployment
1. Clone the repository (branch: frontend)
2. Open a terminal in the cloned directory
3. Run the following commands:
```bash
$> npm install
$> npm run serve
```
4. Open your browser
5. Go to `localhost:3000`
### Building a deployable client
1. Run the following command:
```bash
$> npm run build
```
2. The client will be in the /dist folder
**Note**: Only the client is started. To run the server with the database, please follow the server deployment procedure. The server address can be changed by editing the config.json file.
### Server deployment
1. Clone the repository (branch: server)
2. Open a terminal in the cloned directory
3. Run the following command:
```bash
$> ./start.sh
```
4. Open your browser
5. Go to `localhost:8080/api`
**Note**: You can configure the JSON web token secret by editing the JWT.properties file.
## Usage
1. You can either:
- Deploy the application yourself
- Open your browser and go to http://monopollytweb.herokuapp.com/
2. Log in.
3. Enjoy!
## Technologies
For this project we use the following technologies:
- Angular2
- Typescript
- WebPack
- MySQL
- SpringBoot
## For more information
Visit our presentation page:
- https://ornidon.github.io/MonoPollY/
## Authors
Ioannis Noukakis & Thbaut Loiseau
| 27.651786 | 220 | 0.722958 | fra_Latn | 0.989945 |
61edbe8244281b6103385e254e7dad1b315191e1 | 6,582 | md | Markdown | docs/docs/dev-server/cli-and-configuration.md | yhjhoo/web | 0d0c006f909473bde1d56240a6ea1490bc371abb | [
"MIT"
] | 1,189 | 2020-06-02T21:39:57.000Z | 2022-03-31T07:43:50.000Z | docs/docs/dev-server/cli-and-configuration.md | yhjhoo/web | 0d0c006f909473bde1d56240a6ea1490bc371abb | [
"MIT"
] | 1,402 | 2020-06-05T10:46:26.000Z | 2022-03-31T19:26:09.000Z | docs/docs/dev-server/cli-and-configuration.md | yhjhoo/web | 0d0c006f909473bde1d56240a6ea1490bc371abb | [
"MIT"
] | 183 | 2020-06-24T07:27:37.000Z | 2022-03-31T10:11:22.000Z | # Dev Server >> CLI and Configuration ||2
The dev server can be configured using CLI flags, or with a configuration file.
## CLI flags
| name | type | description |
| ----------------- | ------------ | -------------------------------------------------------------------------------------------------------------------- |
| config | string | where to read the config from |
| root-dir | string | the root directory to serve files from. Defaults to the current working directory |
| base-path | string | prefix to strip from requests URLs |
| open | string | Opens the browser on app-index, root dir or a custom path |
| app-index | string | The app's index.html file. When set, serves the index.html for non-file requests. Use this to enable SPA routing |
| preserve-symlinks | boolean | preserve symlinks when resolving imports |
| node-resolve | boolean | resolve bare module imports |
| watch | boolean | runs in watch mode, reloading on file changes |
| port | number | Port to bind the server to |
| hostname | string | Hostname to bind the server to |
| esbuild-target | string array | JS language target to compile down to using esbuild. Recommended value is "auto", which compiles based on user-agent |
| debug | boolean | whether to log debug messages |
| help | boolean | List all possible commands |
Examples:
```
web-dev-server --open
web-dev-server --node-resolve
web-dev-server --node-resolve --watch
web-dev-server --node-resolve --watch --app-index demo/index.html
web-dev-server --node-resolve --watch --app-index demo/index.html --esbuild-target auto
```
You can also use the shorthand `wds` command:
```
wds --node-resolve --watch --app-index demo/index.html --esbuild-target auto
```
## esbuild target
The `--esbuild-target` flag uses the [@web/dev-server-esbuild plugin](https://modern-web.dev/docs/dev-server/plugins/esbuild/) to compile JS to a compatible language version. Depending on what language features you are using and the browsers you are developing on, you may not need this flag.
If you need this flag, we recommend setting this to `auto`. This will compile based on user-agent, and skip work on modern browsers. [Check the docs](https://modern-web.dev/docs/dev-server/plugins/esbuild/) for all other possible options.
## Configuration file
Web Dev Server looks for a configuration file in the current working directory called `web-dev-server.config`.
The file extension can be `.js`, `.cjs` or `.mjs`. A `.js` file will be loaded as an es module or common js module based on your version of node, and the package type of your project.
We recommend writing the configuration using [node js es module](https://nodejs.org/api/esm.html) syntax and using the `.mjs` file extension to make sure your config is always loaded correctly. All the examples in our documentation use es module syntax.
A config written as es module `web-dev-server.config.mjs`:
```js
export default {
open: true,
nodeResolve: true,
  appIndex: 'demo/index.html',
  // in a monorepo you need to set the root dir to resolve modules
rootDir: '../../',
};
```
A config written as commonjs `web-dev-server.config.js`:
```js
module.exports = {
open: true,
nodeResolve: true,
  appIndex: 'demo/index.html',
  // in a monorepo you need to set the root dir to resolve modules
rootDir: '../../',
};
```
A configuration file accepts most of the command line args camel-cased, with some extra options. These are the full type definitions:
```ts
import { Plugin, Middleware } from '@web/dev-server';
type MimeTypeMappings = Record<string, string>;
interface DevServerConfig {
// whether to open the browser and/or the browser path to open on
  open?: string | boolean;
// index HTML to use for SPA routing / history API fallback
appIndex?: string;
// run in watch mode, reloading when files change
watch?: boolean;
// resolve bare module imports
nodeResolve?: boolean | RollupNodeResolveOptions;
// JS language target to compile down to using esbuild. Recommended value is "auto", which compiles based on user agent.
esbuildTarget?: string | string[];
  // preserve symlinks when resolving imports, instead of following
// symlinks to their original files
preserveSymlinks?: boolean;
// the root directory to serve files from. this is useful in a monorepo
// when executing commands from a package
rootDir?: string;
// prefix to strip from request urls
basePath?: string;
/**
* Whether to log debug messages.
*/
debug?: boolean;
// files to serve with a different mime type
mimeTypes?: MimeTypeMappings;
// middleware used by the server to modify requests/responses, for example to proxy
// requests or rewrite urls
middleware?: Middleware[];
// plugins used by the server to serve or transform files
plugins?: Plugin[];
// configuration for the server
protocol?: string;
hostname?: string;
port?: number;
// whether to run the server with HTTP2
http2?: boolean;
// path to SSL key
sslKey?: string;
// path to SSL certificate
sslCert?: string;
}
```
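For instance, a configuration combining the `middleware` and `plugins` options might look like the following sketch. The URL rewrite and the virtual module are made-up examples to show the shape of the hooks, not part of the official docs:
```js
export default {
  nodeResolve: true,
  // hypothetical Koa-style middleware: rewrite /api requests to local mock files
  middleware: [
    function rewriteApi(context, next) {
      if (context.url.startsWith('/api/')) {
        context.url = `/mocks${context.url}`;
      }
      return next();
    },
  ],
  // hypothetical plugin: serve a generated virtual module
  plugins: [
    {
      name: 'build-info',
      serve(context) {
        if (context.path === '/build-info.js') {
          return { body: `export const buildTime = ${Date.now()};`, type: 'js' };
        }
      },
    },
  ],
};
```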
### Node resolve options
The `--node-resolve` flag uses [@rollup/plugin-node-resolve](https://github.com/rollup/plugins/tree/master/packages/node-resolve) to resolve module imports.
You can pass extra configuration using the `nodeResolve` option in the config:
```js
export default {
nodeResolve: {
exportConditions: ['development'],
},
};
```
| 46.027972 | 292 | 0.580675 | eng_Latn | 0.976204 |
61edc23ea6b80b12a7bc3149ed5552ebaeb09b7e | 1,032 | md | Markdown | src/pages/blog/credit-cards-and-phone-cards.md | bekovicdusan/Gatsby | 42075d2cc88793a21c8f3595a58d2bf041afb210 | [
"MIT"
] | 10 | 2020-06-06T08:47:19.000Z | 2021-12-15T20:08:59.000Z | src/pages/blog/credit-cards-and-phone-cards.md | bekovicdusan/Gatsby | 42075d2cc88793a21c8f3595a58d2bf041afb210 | [
"MIT"
] | 44 | 2020-05-19T02:14:15.000Z | 2022-03-29T20:43:54.000Z | src/pages/blog/credit-cards-and-phone-cards.md | bekovicdusan/Gatsby | 42075d2cc88793a21c8f3595a58d2bf041afb210 | [
"MIT"
] | 10 | 2020-06-19T10:05:01.000Z | 2021-08-29T18:39:49.000Z | ---
templateKey: blog-post
title: Credit Cards and Phone Cards
date: 2004-07-29
path: /credit-cards-and-phone-cards
featuredpost: false
featuredimage:
tags:
- credit-cards
- phone-cards
category:
- Iraq
comments: true
share: true
---
Some more FYI info for anybody getting called up...
There are ATM machines as far as Kuwait, but none in Iraq that I've seen. For cash, you need to go to Finance and either cash a check or get a draw from your pay. Nobody in Iraq takes credit cards that I've seen. On the other hand, there's not all that much to buy...
For telephone, nothing here beats the Segovia IP phones. You get about 10 hours for $25 - something like $.05 per minute. AT&T phones are common, too, but the AT&T phone cards that have domestic minutes get used up 10 or 20 times as fast for international calls, so a 300-minute card would last an hour if you're lucky. So, get a phone card or two for en route, but once you're here, go with Segovia (which you buy online and just get a number to use, no actual card).
| 46.909091 | 468 | 0.749031 | eng_Latn | 0.998323 |
61ee8ae16410d0616bfbdaf25d2adbad4e6dd2f8 | 3,160 | md | Markdown | site/guides/1_the_basics/3_writing_to_stores.md | jamesgpearce/tinybase | 0847e1a8ea96f46b2002df50d7a9f0e723847a84 | [
"MIT"
] | 968 | 2022-01-10T19:18:12.000Z | 2022-03-29T19:20:14.000Z | site/guides/1_the_basics/3_writing_to_stores.md | jamesgpearce/tinybase | 0847e1a8ea96f46b2002df50d7a9f0e723847a84 | [
"MIT"
] | 9 | 2022-01-15T12:18:50.000Z | 2022-02-14T19:18:35.000Z | site/guides/1_the_basics/3_writing_to_stores.md | jamesgpearce/tinybase | 0847e1a8ea96f46b2002df50d7a9f0e723847a84 | [
"MIT"
] | 18 | 2022-01-15T01:38:32.000Z | 2022-02-22T01:56:17.000Z | # Writing To Stores
This guide shows you how to write data to a Store.
A Store has a simple hierarchical structure:
- The Store contains a number of Table objects.
- Each Table contains a number of Row objects.
- Each Row contains a number of Cell objects.
Once you have created a store, you can write data to it with one of its setter
methods, according to the level of the hierarchy that you want to set.
For example, you can set the data for the entire Store with the setTables
method:
```js
const store = createStore();
store.setTables({pets: {fido: {species: 'dog'}}});
```
Hopefully self-evidently, this sets a whole Store to have one Table object
(called `pets`), containing one Row object (called `fido`), containing one Cell
object (called `species` and with the string value `dog`):
```js
console.log(store.getTables());
// -> {pets: {fido: {species: 'dog'}}}
```
You can also alter Store data at different granularities with the setTable
method, the setRow method, and the setCell method:
```js
store.setTable('species', {dog: {price: 5}});
console.log(store.getTables());
// -> {pets: {fido: {species: 'dog'}}, species: {dog: {price: 5}}}
store.setRow('species', 'cat', {price: 4});
console.log(store.getTables());
// -> {pets: {fido: {species: 'dog'}}, species: {dog: {price: 5}, cat: {price: 4}}}
store.setCell('pets', 'fido', 'color', 'brown');
console.log(store.getTables());
// -> {pets: {fido: {species: 'dog', color: 'brown'}}, species: {dog: {price: 5}, cat: {price: 4}}}
```
A Cell can be a string, a number, or a boolean value.
It's worth mentioning here that there are two extra methods to manipulate Row
objects. The addRow method is like the setRow method but automatically assigns
it a new unique Id. And the setPartialRow method lets you update multiple Cell
values in a Row without affecting the others.
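A quick side sketch of those two methods, on a second Store so the running example above is left untouched (the assigned Id shown is illustrative):
```js
const store2 = createStore();
const rowId = store2.addRow('pets', {species: 'cat'});
// addRow assigned a brand new unique Id, which it returns
console.log(store2.getTables());
// -> {pets: {0: {species: 'cat'}}}
store2.setPartialRow('pets', rowId, {color: 'black'});
// only the `color` Cell is set; the `species` Cell is untouched
console.log(store2.getTables());
// -> {pets: {0: {species: 'cat', color: 'black'}}}
```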
## Deleting Data
There are dedicated deletion methods (again, for each level of granularity),
such as the delTable method, the delRow method, and the delCell method. For
example:
```js
store.delTable('species');
console.log(store.getTables());
// -> {pets: {fido: {species: 'dog', color: 'brown'}}}
```
Deletions are also implied when you set an object that omits something that
existed before:
```js
console.log(store.getTables());
// -> {pets: {fido: {species: 'dog', color: 'brown'}}}
store.setRow('pets', 'fido', {species: 'dog'});
console.log(store.getTables());
// -> {pets: {fido: {species: 'dog'}}}
// The `color` Cell has been deleted.
```
Table and Row objects cannot be empty - if they are, they are removed - which
leads to a cascading effect when you remove the final child of a parent object:
```js
store.delCell('pets', 'fido', 'species');
console.log(store.getTables());
// -> {}
// The `fido` Row and `pets` Table have been recursively deleted.
```
## Summary
That's a quick overview on how to write data to a Store. But of course you want
to get it out too!
In the examples above, we've used the getTables method to get a view of all the
data in the Store. Unsurprisingly, you can also use more granular methods to get
data out - for which we proceed to the Reading From Stores guide.
| 32.244898 | 99 | 0.706013 | eng_Latn | 0.99404 |
61eed55c2fdb486f2805c2d32384f2740bf758f7 | 2,397 | md | Markdown | README.md | ceskillets/DCV-Basic-SSL-Decrypt-Policy | 8278f5ecd09d6cb7de97a10136f45cfd7e1c61f6 | [
"MIT"
] | null | null | null | README.md | ceskillets/DCV-Basic-SSL-Decrypt-Policy | 8278f5ecd09d6cb7de97a10136f45cfd7e1c61f6 | [
"MIT"
] | 1 | 2019-10-14T03:48:10.000Z | 2019-10-14T03:48:10.000Z | README.md | ceskillets/DCV-Basic-SSL-Decrypt-Policy | 8278f5ecd09d6cb7de97a10136f45cfd7e1c61f6 | [
"MIT"
] | null | null | null | # DCV-Basic-SSL-Decrypt-Policy
This skillet creates a basic ssl decryption policy that installs a self-signed certificate and a decryption policy that decrypts high-risk categories and the eicar antivirus test site.
This skillet pushes a self-signed certificate to a firewall, configures it as the forward-trust-certificate and creates a decryption policy and profile. It also creates a custom URL category for the *.eicar.org website and enables high-risk and this custom category for decryption. This skillet is meant mostly for POC or demonstration purposes. Production environments are likely to leverage an internal PKI environment to perform decryption instead of using a self-signed certificate.
To use the skillet, select the VSYS (default vsys1) and deploy to the firewall. If testing decryption with AntiVirus, browse to the eicar test virus site through the firewall and download the test file using https. If you have not imported the certificate in the system’s local trusted certificate store, you will receive a certificate error. The download will be blocked. The other high-risk URL categories that are part of the decryption rule are:
- Adult
- Command-and-Control
- Content-Delivery-Networks
- Copyright-Infringement
- Dynamic-DNS
- Gambling
- Hacking
- High-Risk
- Malware
- Parked
- Peer-to-Peer
- Proxy-Avoidance-and-Anonymizers
- Shareware-and-Freeware
- Social-Networking
- Unknown
## Support Policy
The code and templates in the repo are released under an as-is, best effort, support policy. These scripts should be seen as community supported and Palo Alto Networks will contribute our expertise as and when possible. We do not provide technical support or help in using or troubleshooting the components of the project through our normal support options such as Palo Alto Networks support teams, or ASC (Authorized Support Centers) partners and backline support options. The underlying product used (the VM-Series firewall) by the scripts or templates are still supported, but the support is only for the product functionality and not for help in deploying or using the template or script itself. Unless explicitly tagged, all projects or work posted in our GitHub repository (at https://github.com/PaloAltoNetworks) or sites other than our official Downloads page on https://support.paloaltonetworks.com are provided under the best effort policy.
| 59.925 | 950 | 0.806425 | eng_Latn | 0.998686 |
61ef3282c9cf68d602b210b41bb00213d2663c36 | 220 | md | Markdown | README.md | ernestas-poskus/queue | be8ff0e77cbf10ba92236b6662f5ad99fb80d66d | [
"MIT"
] | 4 | 2017-04-20T14:16:00.000Z | 2019-05-09T00:57:08.000Z | README.md | ernestas-poskus/queue | be8ff0e77cbf10ba92236b6662f5ad99fb80d66d | [
"MIT"
] | null | null | null | README.md | ernestas-poskus/queue | be8ff0e77cbf10ba92236b6662f5ad99fb80d66d | [
"MIT"
] | null | null | null | queue
=====
Golang channel queue-worker implementation (a minimal sketch of the pattern follows the component list below)
- Dispatcher: initializes queue & workers
- Collector: distributes work requests to Dispatcher
- Worker: performs unit of work
- Job: interface for implementation
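A minimal, hypothetical sketch of how these components can fit together using plain channels (the real package's types and signatures may differ):
```go
package main
import "fmt"
// Job is the unit of work submitted to the queue.
type Job interface{ Do() }
type printJob struct{ msg string }
func (p printJob) Do() { fmt.Println(p.msg) }
// worker consumes jobs from the shared queue until it is closed.
func worker(queue <-chan Job, done chan<- struct{}) {
	for job := range queue {
		job.Do()
	}
	done <- struct{}{}
}
func main() {
	queue := make(chan Job, 8) // dispatcher: initializes the queue...
	done := make(chan struct{})
	for i := 0; i < 3; i++ { // ...and the workers
		go worker(queue, done)
	}
	for i := 0; i < 5; i++ { // collector: distributes work requests
		queue <- printJob{msg: fmt.Sprintf("job %d", i)}
	}
	close(queue)
	for i := 0; i < 3; i++ {
		<-done
	}
}
```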
| 22 | 52 | 0.786364 | eng_Latn | 0.94758 |
61ef8b31fa5a755965a5da3fc945915d6b03f3b5 | 800 | md | Markdown | catalog/shoujo-ga-kowareta-toki/en-US_shoujo-ga-kowareta-toki.md | htron-dev/baka-db | cb6e907a5c53113275da271631698cd3b35c9589 | [
"MIT"
] | 3 | 2021-08-12T20:02:29.000Z | 2021-09-05T05:03:32.000Z | catalog/shoujo-ga-kowareta-toki/en-US_shoujo-ga-kowareta-toki.md | zzhenryquezz/baka-db | da8f54a87191a53a7fca54b0775b3c00f99d2531 | [
"MIT"
] | 8 | 2021-07-20T00:44:48.000Z | 2021-09-22T18:44:04.000Z | catalog/shoujo-ga-kowareta-toki/en-US_shoujo-ga-kowareta-toki.md | zzhenryquezz/baka-db | da8f54a87191a53a7fca54b0775b3c00f99d2531 | [
"MIT"
] | 2 | 2021-07-19T01:38:25.000Z | 2021-07-29T08:10:29.000Z | # Shoujo ga Kowareta Toki

- **type**: manga
- **volumes**: 1
- **original-name**: 少女が壊れたとき
- **start-date**: 1994-01-19
## Tags
- horror
- supernatural
## Authors
- Kawaguchi
- Madoka (Story & Art)
## Synopsis
A detective hunts for clues on the impossible "alibi" of the mysterious girl, a suspect for the murder of her own parents, who testifies that she has the uncontrollable and spontaneous ability to transform into literally anything. This collection of short stories is filled with mysterious and fantastic plot, introducing you to the fantasy worlds of Kawaguchi Madoka!
(Source: MangaHelpers)
## Links
- [My Anime list](https://myanimelist.net/manga/94311/Shoujo_ga_Kowareta_Toki)
| 27.586207 | 368 | 0.7325 | eng_Latn | 0.945994 |
61f0241098dbadad5f3be04cf59c59381831dec4 | 1,790 | md | Markdown | docs/files/lib_docview.js.md | cocos-creator/firedoc | 9fa221862eb500e3f190dc63564d3597f515b1e4 | [
"BSD-3-Clause"
] | 49 | 2016-01-19T02:50:58.000Z | 2020-05-07T13:52:09.000Z | docs/files/lib_docview.js.md | fireball-x/firedoc | 9fa221862eb500e3f190dc63564d3597f515b1e4 | [
"BSD-3-Clause"
] | 41 | 2015-05-05T08:27:23.000Z | 2015-12-14T15:57:14.000Z | docs/files/lib_docview.js.md | cocos-creator/firedoc | 9fa221862eb500e3f190dc63564d3597f515b1e4 | [
"BSD-3-Clause"
] | 17 | 2016-02-04T08:28:51.000Z | 2018-06-05T11:52:07.000Z |
# firedoc 1.9.2
Fireball is the game engine for the future.
### File: ``
```js
/* global YUI */
/**
* The firedoc module
* @module firedoc
*/
const _ = require('underscore');
const path = require('path');
const Handlebars = require('handlebars');
/**
* View class borrowed from [Selleck](https://github.com/rgrove/selleck)
* The view class is a **`handlebars`** template helper.
*
* @class DocView
* @constructor
* @param {Object} data Meta data to use in this template
* @param {String} templateName The name of the template file to render.
**/
function DocView (data, templateName, cwd) {
this.templateName = templateName;
this.cwd = path.join(cwd || '');
this.assets = path.join(cwd || '', 'assets');
_.extend(this, data);
// register helpers
var self = this;
Handlebars.registerHelper('relink', function (item, options) {
item = item || '';
if (self.project.local) {
return '//' + self.project.root + '/' + item;
} else {
return self.project.baseurl + '/' + item;
}
});
}
DocView.prototype = {
/**
* **Mustache** `lambda` method for setting the HTML title
* @method htmlTitle
*/
htmlTitle: function () {
var name = this.name;
var title = name;
try {
if (title) {
if (this.project.name) {
title += ' - ' + this.project.name;
}
} else {
title = this.project.name;
}
} catch (e) {}
return title;
},
/**
* **Mustache** `lambda` method for setting the title
* @method title
*/
title: function () {
var name = this.name;
var title = name;
try {
title = this.project.name;
if (name) {
title += ': ' + name;
}
} catch (e) {}
return title;
}
};
exports.DocView = DocView;
```
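A hypothetical usage sketch of this class (the `require` path and the data shape are illustrative only; the real call sites live elsewhere in firedoc):
```js
const { DocView } = require('./lib/docview');
const view = new DocView(
  { name: 'Quick Start', project: { name: 'firedoc', baseurl: '', local: false } },
  'main', // template name to render
  '.'     // cwd, used to resolve the assets directory
);
console.log(view.htmlTitle()); // -> "Quick Start - firedoc"
```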
| 20.11236 | 74 | 0.574302 | eng_Latn | 0.824199 |
61f17b2fed7cea8e0ed39097612b11905fd9a2c5 | 1,732 | md | Markdown | examples/peg_peg/README.md | pombredanne/Arpeggio | d7380ed4171663f2b027c356495a5ca352fc3fc8 | [
"MIT"
] | 127 | 2015-01-01T14:02:29.000Z | 2018-12-27T12:59:09.000Z | examples/peg_peg/README.md | pombredanne/Arpeggio | d7380ed4171663f2b027c356495a5ca352fc3fc8 | [
"MIT"
] | 46 | 2018-12-27T17:20:58.000Z | 2022-03-20T16:38:30.000Z | examples/peg_peg/README.md | pombredanne/Arpeggio | d7380ed4171663f2b027c356495a5ca352fc3fc8 | [
"MIT"
] | 30 | 2015-06-22T03:57:39.000Z | 2018-07-07T08:15:06.000Z | # PEG in PEG example
This example demonstrates that the PEG language can be defined in PEG.
In the `peg.peg` file, a grammar of the PEG language is defined. This grammar is loaded
and the parser is constructed using the `ParserPEG` class from the `arpeggio.peg`
module.
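In code, that construction might look roughly like the following sketch (hypothetical; the root rule name and flags here are assumptions, and the actual code lives in `peg_peg.py`):
```python
# Hypothetical sketch of loading the PEG grammar with Arpeggio.
from arpeggio.peg import ParserPEG
with open('peg.peg') as f:
    peg_grammar = f.read()
# 'peggrammar' is assumed to be the root rule of peg.peg here
parser = ParserPEG(peg_grammar, 'peggrammar', debug=True)
parse_tree = parser.parse(peg_grammar)  # parse the grammar file with itself
```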
Semantic analysis in the form of the `PEGVisitor` class is applied to the parse tree
produced from the `peg.peg` grammar. The result of the semantic analysis is
a parser model which should be the same as the one that is used to parse the
`peg.peg` grammar in the first place.
To verify that we have got the same parser, the parser model is replaced with
the one produced by semantic evaluation and the parsing of `peg.peg` is
repeated.
To run the example do:
```bash
$ python peg_peg.py
```
This example runs in debug mode (`debug` is `True` in `ParserPEG` constructor
call) and detailed log is printed and `dot` files are produced. `dot` files
will be named based on the root grammar rule.
You can visualize `dot` files with some dot file viewers (e.g.
[ZGRViewer](http://zvtm.sourceforge.net/zgrviewer.html)) or produce graphics
with the `dot` tool (you need to install [GraphViz](http://www.graphviz.org/) for that)
```bash
$ dot -Tpng -O *dot
```
You should verify that there are three parser model graphs and all three are
the same:
- parser model produced by the canonical PEG grammar definition specified in
`arpeggio.peg` module using [Python
notation](http://textx.github.io/Arpeggio/grammars/#grammars-written-in-python).
- parser model produced by the `ParserPEG` construction that represents the
grammar loaded from the `peg.peg` file.
- parser model created by applying `PEGVisitor` explicitly to the parse tree
produced by parsing the `peg.peg` file.
| 36.083333 | 83 | 0.761547 | eng_Latn | 0.998151 |
61f2aa1930ad6b3a422137b5e5c772765c914c12 | 12,220 | md | Markdown | README.md | TheRedDaemon/LittleCrusaderAsi | da98790f68422d359dc93c198e2a79ec60719007 | [
"MIT"
] | 4 | 2020-11-30T21:53:57.000Z | 2022-03-18T05:14:48.000Z | README.md | TheRedDaemon/LittleCrusaderAsi | da98790f68422d359dc93c198e2a79ec60719007 | [
"MIT"
] | 3 | 2020-10-22T20:40:04.000Z | 2021-10-05T14:24:22.000Z | README.md | TheRedDaemon/LittleCrusaderAsi | da98790f68422d359dc93c198e2a79ec60719007 | [
"MIT"
] | null | null | null | # LittleCrusaderAsi
## Table of contents
* [Summary](#summary)
* [Installation](#installation)
* [Getting *LittleCrusaderAsi*](#getting-littlecrusaderasi)
* [Getting an AsiLoader](#getting-an-asiloader)
* [Location of files and configuration](#location-of-files-and-configuration)
* [Current Features](#current-features)
* [FAQ](#faq)
* [Credits](#credits)
* [Additional Information](#additional-information)
* [License](#license)
* [Dependencies](#dependencies)
* [Other Crusader projects](#other-crusader-projects)
* [Special Thanks](#special-thanks)
## Summary
*LittleCrusaderAsi* is a modification for Stronghold Crusader 1 inspired by the awesome work on the [UnofficialCrusaderPatch][1]. The latter provides a ton of features and is quite flexible when it comes to version compatibility. However, the patcher modifies the executable, which requires restarts if something needs to change.
The *LittleCrusaderAsi* tries to tackle this issue by using a very common technique when it comes to modding. The modification uses an AsiLoader (see [Installation](#installation)) to load a Dynamic Link Library (dll, just named asi in this approach) into the address space of a program. That allows it to execute additional code and modify the software at runtime.
The implementation tries to grant individual features a lot of freedom in how to handle issues like missing addresses, so version compatibility is provided on a per-feature basis.
That being said, as long as no one with versions other than 1.41 and 1.41.1-E digs through the addresses, the mods will mostly be limited to the latter. Additionally, the restriction is further defined by the mod having so far been created using a German Crusader version. As a result:
**Full compatibility so far is only guaranteed for Crusader versions 1.41 and 1.41.1-E, and only if they are codewise identical to the German version. (A guess would be that this includes all versions using the Latin alphabet, like the English version.)** Other "versions" of 1.41 and 1.41.1-E might even create crashes at the moment, since the current system might not know how to handle them. (There are reports of problems with the Russian version and the UnofficialCrusaderPatch, so this might be a candidate.) In all other cases (for example version 1.2), the feature should simply not work and the issue will be written to a log file.
For more information on the implementation approach and structure, please take a look at the [README](src/README.md) inside the *src*-folder. I hope it is a good read for everyone who wants to contribute. ^^
## Installation
### Getting LittleCrusaderAsi
Go [HERE](https://github.com/TheRedDaemon/LittleCrusaderAsi/releases/latest) to get the latest release.
The project can also be built from source. It is written in C++17 and the repository contains a Visual Studio 2019 project. One must make sure to use the Win32 mode and (for DEBUG mode) redefine *Properties->General->Output Directory* and *Properties->Debugging->Command*, since both are currently configured to allow debugging the dll and therefore point into the file structure of whoever last edited the project file.
### Getting an AsiLoader
*LittleCrusaderAsi* relies on a third-party tool that loads it into the process. Basically only one tool was tested, but essentially two variants are possible:
* Using [Ultimate-ASI-Loader](https://github.com/ThirteenAG/Ultimate-ASI-Loader/releases) directly. The release will contain a 'dinput8.dll', which needs to be placed in the game root directory. However, it needs to be renamed to a dll name the game loads. One that works is 'ddraw.dll'. Other names might be possible, but were not tested. Now the game will load ASI files placed in the game root directory or in the folders 'scripts' or 'plugins', if they are created inside the game directory.
* Using [DxWrapper](https://github.com/elishacloud/dxwrapper), which uses code from the Ultimate-ASI-Loader to provide the same feature alongside a ton more, like [Dd7to9](https://github.com/elishacloud/dxwrapper/wiki/DirectDraw-to-Direct3D9-Conversion). The latter allows stuff like running the game in window mode (with a few issues, like the cursor not being bound to the window). For the installation, please refer to the DxWrapper documentation. In short, it is required that the provided 'ddraw.dll' is placed in the game root folder alongside 'dxwrapper.dll' and 'dxwrapper.ini'. The asi-loading needs to be activated in the 'dxwrapper.ini'. It also supports the folders 'scripts' and 'plugins'.
Both also have additional features; if interested, please refer to their documentation. Furthermore, loading *LittleCrusaderAsi* into the process by other means may also work, but there are no guarantees.
### Location of files and configuration
First, all files mentioned below need to be placed in the same directory. This might either be the game root or one of the supported folders 'scripts'/'plugins' (recommended).
Currently three files are needed:
1. The 'LittleCrusaderAsi.asi' of course.
2. The 'logger.config', which is the configuration for the logger. Simply using the file [HERE](src/LittleCrusaderAsi/dependencies/easylogging++/logger.config) is enough. For customisation please refer to the easylogging++ documentation (see [Dependencies](#dependencies)). Omitting it will result in the use of a default configuration.
3. The actual 'modConfig.json'. This file contains the mod configuration. [THIS](ConfigurationDescription) non-valid JSON file contains examples for every possible adjustment and feature.
If the ASI file is missing, nothing will work. If the modConfig is missing or invalid, the mod will crash and write a log.
Log output is written to a file named 'crusaderAsi.log', which will be overwritten on every start and generated in the same folder as the other files. In case of a noticed exception or if requested by a feature, the log will be copied and renamed on exit. Hard crashes (in case of corrupted memory for example) will likely not show up in the log.
At the moment, configuring the modification is kinda clunky. The focus for now is on functionality instead of usability. However, if anyone wants to provide a GUI for generating the configuration json, feel free. ^^
In the future it might also be possible to define the configuration in the overlay menu and save it at the end, but this is currently not possible.
## Current Features
A list of the features currently provided by the *LittleCrusaderAsi*. The short summary will not contain in-depth details regarding the configuration. Please search the [Example Json](ConfigurationDescription) for the feature name to get information about the configuration.
**KeyboardInterceptor**
*Supported versions:* All (in theory)
The keyboard interceptor has two features. First, implementation utility: it allows other mods to register key functions and to receive text input. Second, it allows redefining keypresses. The latter is directly handled through the configuration of this mod. It is therefore possible to transform a press on the 'a'-key into a press on 'b'. However, this is only possible for single keys and is done by catching the actual key and telling the game another was pressed. As a result, combinations like 'shift+a' are carried over (and would, for example, become 'shift+b') and text inputs are unreliable. To circumvent this issue, the interceptor as a whole has a configurable on/off-key.
**BltOverlay**
*Supported versions:* V1.41.1-E | V1.41
Not directly a feature that changes the game, but the most noticeable utility. Mods are now able to provide menus that can be accessed through an overlay menu. It also provides a console window that can display some log messages. For more information on the current sub-menus, please refer to [THIS](MenuDescription.md) file.
**BuildRangeChanger**
*Supported versions:* V1.41.1-E | V1.41
A simple modification that allows changing the castle build range. It can be configured to be active at start, can register a key function used for activation (with an activated KeyboardInterceptor) or can be configured in more detail by using the [*BltOverlay* menu](MenuDescription.md). Ranges are defined per map size. The key function allows toggling between vanilla and custom values. The AI will build walls regardless, but no towers if they fall out of range.
**AICLoad**
*Supported versions:* V1.41.1-E | V1.41
The AIC loader of the *LittleCrusaderAsi*. The mod was created to support the file format defined by the UCP (2.14) ([Example with vanilla AIC](https://github.com/Sh0wdown/UnofficialCrusaderPatch/blob/master/UnofficialCrusaderPatch/AIC/Resources/vanilla.json)). This mod possesses a lot of features, among them the ability to load new AIC files, define their load order, a live editor and the possibility to save the current custom values. Please check out the [Example Json](ConfigurationDescription) and the [Menu Description](MenuDescription.md) for more details.
Note that all AIC values are currently applied per value, so it is possible to carelessly create AI amalgamations if one defines only half of the Rat's values while having an AIC with another Rat in line.
One last note: not all log messages are displayed in the text console, and there is also currently no way to scroll back. So if in doubt, check the log file created in the mod directory.
## FAQ
A place for questions some might have.
1. **Is *LittleCrusaderAsi* compatible with the UCP?**
A test with version 0.1 of *LittleCrusaderAsi* and version 2.14 of the UCP (with not all, but multiple features enabled) yielded no obvious errors. Version 0.2 was not tested so far, but the only changes to processes are to some of the main DirectDraw functions, which seem to be untouched by the UCP so far? I am rather sure there should be no issues. That being said, using AIC modifications in both the UCP and *LittleCrusaderAsi* might lead to unwanted overwrites. For example, the "default values" in the editor menu will be the values installed by the UCP.
## Credits
* [TheRedDaemon](https://github.com/TheRedDaemon) - Maintainer and Developer
* The people of the [UCP][1] - Indirect supporters, by finding knowledge that can be taken. :-)
## Additional Information
### License
This repository is licensed using the MIT License.
### Dependencies
*LittleCrusaderAsi* uses third-party tools/code.
You will find them together with a copy of their licenses under [THIS](src/LittleCrusaderAsi/dependencies) folder.
Dependency | Used Version | Notes
------------ | ------------- | -------------
[Easylogging++](https://github.com/amrayn/easyloggingpp) | 9.96.7 |
[JSON for Modern C++](https://github.com/nlohmann/json) | 3.9.1 |
[DDFontEngine](https://realmike.org/blog/projects/fast-bitmap-fonts-for-directdraw) | - | Old Tutorial for DirectDraw fonts. Classes themselves unused, but kept for reference.
### Other Crusader projects
There are a handful of projects on GitHub for Stronghold (Crusader) 1. So take a look. After all, every development might be helpful for other projects.
(Note: Without any claim to comprehensiveness.)
* Was the [UnofficialCrusaderPatch][1] already mentioned? ^^
* The [Gm1KonverterCrossPlatform](https://github.com/PodeCaradox/Gm1KonverterCrossPlatform) is a tool developed for easier editing of all kinds of Crusader textures and sprites.
* While the [Sourcehold](https://github.com/sourcehold) main project seems to be lying dormant at the moment (October 2020), it might still be worth a look.
* As an interesting subproject, [sourcehold-maps](https://github.com/sourcehold/sourcehold-maps) tries to analyze how the map files are structured and is still active (October 2020). Who knows what might be possible by only messing with the map files?
* Any more suggestions? Feel free to add them (as long as they are related to Stronghold at least)!
## Special Thanks
Basically everyone involved with and thanked by the [UnofficialCrusaderPatch][1]. So pass the thanks down the thanks tree. ^^
And especially [Firefly Studios](https://fireflyworlds.com/) for the creation of the Stronghold series. There can not be a mod without a game in the first place...
[1]:https://github.com/Sh0wdown/UnofficialCrusaderPatch
---
title: "About Me"
permalink: /about/
layout: single
classes: wide
author_profile: true
---
<style>
.container {
margin: auto;
padding: 10px;
}
.left {
width: 50%;
float: left;
}
.right {
margin-left: 50%;
}
</style>
<p>
My name is <b>Klimovich Artsemi</b> and I am a Game Developer.<br><br>
Ever since I was a kid, games have been my passion. They got me into programming and learning game engines. Games are a perfect combo of creativity and technology. I am mostly fond of the second aspect, but the first one never lets you get bored. Programming and designing game architecture is like a never-ending puzzle that I am engaged in solving.
</p>
<h2>My path</h2>
<p>
My first team experience in game dev was when my classmate and I decided to create a simple mobile game, but lack of expertise brought our project down. That was when I started working hard to become skilled enough to embody almost any game idea. One day I found a post about a small team of mobile developers looking for a novice who could help them develop small features and exterminate bugs. That's how I got my first (almost) professional contract. And here I am today, a developer at SKRGames, working on mobile games.
</p>
<h2>Hobbies</h2>
<p>
At home I continue to work, but on my own projects. Participating in game jams challenges my creativity, programming and problem-solving skills. I like to browse and try other indie enthusiasts' projects because they are full of really inspiring and fresh ideas. My other hobby is films. I enjoy watching them because they let you see the world from a different perspective and learn something new.
</p>
<h2>Skills</h2>
<div class="container">
<div class="left">
<h3>Programming Languages</h3>
<ul>
<li><img style="height:32px;width:32px;" src="/images/csharp-icon128x128.png"> | C#</li>
<li><img style="height:32px;width:32px;" src="/images/cpp-icon128x128.png"> | C++</li>
</ul>
</div>
<div class="right">
<h3>Languages</h3>
<ul>
<li><img style="height:32px;width:32px;" src="/images/russian-flag128x128.png"> | Russian</li>
<li><img style="height:32px;width:32px;" src="/images/gbr-flag128x128.png"> | English (upper-intermediate)</li>
</ul>
</div>
</div>
<h2>Software Experience</h2>
<div class="container">
<div class="left">
<h3>Technical</h3>
<ul>
<li><img style="height:32px;width:32px;" src="/images/vs-icon128x128.png"> | VS</li>
</ul>
</div>
<div class="right">
<h3>Game Engines</h3>
<ul>
<li><img style="height:32px;width:32px;" src="/images/unity3d-icon128x128.png"> | Unity</li>
</ul>
</div>
</div>
<div class="container">
<div class="left">
<h3>Source Control</h3>
<ul>
<li><img style="height:32px;width:32px;" src="/images/git-icon128x128.png"> | Git</li>
</ul>
</div>
<div class="right">
<h3>Management</h3>
<ul>
<li><img style="height:32px;width:32px;" src="/images/trello-icon128x128.png"> | Trello</li>
</ul>
</div>
</div>
# Questions...
1. Why do we need an additional bias term in a perceptron?
2. What is the role of the activation function in a perceptron? Why do we need an activation function at all?
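As a scratchpad for thinking these through, here is a tiny illustrative sketch (generic NumPy, not tied to any particular course material):

```python
import numpy as np

def perceptron(x, w, b):
    # Without the bias b, the decision boundary w.x = 0 would be forced
    # to pass through the origin; b lets it shift anywhere in space.
    z = np.dot(w, x) + b
    # Without a non-linear activation (here a step function), stacked
    # linear units would collapse into a single linear map.
    return 1 if z > 0 else 0

x = np.array([1.0, 0.0])
w = np.array([0.5, 0.5])
print(perceptron(x, w, b=0.0))   # 1: the unit fires for this input
print(perceptron(x, w, b=-0.6))  # 0: the bias shifts the firing threshold
```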
# Scheduling-CI3
# Final-Project-CI3
# 2.0.0
* Switch from the WP REST API plugin to the native WP API in WordPress 4.6+.
# 1.0.1
* Bug fixes for the latest WordPress.
* A WordPress security update completely changed the API responses for metadata and taxonomy.
# 1.0.0
* Initial release
"One of the great moral issues of our time is the global crisis of climate change. Let me be very clear about climate change. Climate change is not a Democratic issue or a progressive issue. It is not a Republican issue or a conservative issue. What it is, is an issue that has everything to do with physics. It is an issue of physics. What we know beyond a shadow of a doubt is that the debate is over, and that is that the vast majority of the scientists who have studied the issues are quite clear. What they tell us over and over again is that climate change is real, climate change is caused by human activity, and climate change is already causing devastating problems throughout our country and, in fact, throughout the world."
# Interface: AudioAttributes
The bitrate in kbps.
## Hierarchy
* **AudioAttributes**
## Index
### Properties
* [Bitrate](_api_types_.audioattributes.md#optional-readonly-bitrate)
* [Encoding](_api_types_.audioattributes.md#readonly-encoding)
* [Samplerate](_api_types_.audioattributes.md#readonly-samplerate)
## Properties
### `Optional` `Readonly` Bitrate
• **Bitrate**? : *undefined | number*
*Defined in [api/types.ts:3807](https://github.com/patrickmichalina/onvif-rx/blob/3e9b152/src/api/types.ts#L3807)*
___
### `Readonly` Encoding
• **Encoding**: *string*
*Defined in [api/types.ts:3808](https://github.com/patrickmichalina/onvif-rx/blob/3e9b152/src/api/types.ts#L3808)*
___
### `Readonly` Samplerate
• **Samplerate**: *number*
*Defined in [api/types.ts:3809](https://github.com/patrickmichalina/onvif-rx/blob/3e9b152/src/api/types.ts#L3809)*
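Taken together, the members above correspond to a declaration of roughly the following shape (inferred from this page, not copied from `api/types.ts`; the example values are made up):

```ts
// Shape inferred from the documented members above; the actual
// declaration lives in api/types.ts of onvif-rx.
export interface AudioAttributes {
  /** The bitrate in kbps. */
  readonly Bitrate?: number;
  /** The audio encoding; the exact set of values is not documented on this page. */
  readonly Encoding: string;
  /** The sample rate; units are not documented on this page. */
  readonly Samplerate: number;
}

// Example value satisfying the interface (all values are illustrative only):
const attrs: AudioAttributes = { Encoding: "G711", Samplerate: 8, Bitrate: 64 };
```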
# String Calculator kata
The String Calculator kata, written in Haskell.
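For context, the kata's opening step is an `add` function that sums the numbers in a comma-separated string; a minimal sketch of that step (not necessarily how this repository implements it) could look like:

```haskell
-- Minimal sketch of the kata's first step; not necessarily how this
-- repository implements it.
module StringCalculator (add) where

-- Split a string on a separator character.
splitOn :: Char -> String -> [String]
splitOn sep s = case break (== sep) s of
  (chunk, [])     -> [chunk]
  (chunk, _:rest) -> chunk : splitOn sep rest

-- add ""    == 0
-- add "1"   == 1
-- add "1,2" == 3
add :: String -> Int
add "" = 0
add s  = sum (map read (splitOn ',' s))
```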
## How to run it?
```bash
$ cabal sandbox init
$ cabal install --enable-tests
$ cabal test
``` | 15.272727 | 45 | 0.720238 | eng_Latn | 0.771592 |
61f5c7425f316f2f1fe9d1b7655f1c60781be038 | 1,157 | md | Markdown | README.md | szes/get_ali_phone_num | ef1d66148959ae5767a2f231257a0bfa68defeb9 | [
"MIT"
] | null | null | null | README.md | szes/get_ali_phone_num | ef1d66148959ae5767a2f231257a0bfa68defeb9 | [
"MIT"
] | null | null | null | README.md | szes/get_ali_phone_num | ef1d66148959ae5767a2f231257a0bfa68defeb9 | [
"MIT"
] | null | null | null | ##爬取阿里小号
---
**爬取阿里小号,将归属地及手机号保存到pg数据库的如下的表里面**
*表结构如下:*
DROP SEQUENCE IF EXISTS
"public"."ali_phone_id_seq";
CREATE SEQUENCE "public"."ali_phone_id_seq"
INCREMENT 1
MINVALUE 1
MAXVALUE 9223372036854775807
START 1
CACHE 1;
DROP TABLE IF EXISTS "public"."ali_phone_info";
CREATE TABLE "public"."ali_phone_info" (
"id" int4 DEFAULT nextval('ali_phone_id_seq'::regclass) NOT NULL,
"addr" varchar(64) COLLATE "default" NOT NULL,
"phone_num" varchar(32) COLLATE "default" NOT NULL
)
WITH (OIDS=FALSE)
;
COMMENT ON COLUMN "public"."ali_phone_info"."id" IS '手机号码信息ID';
COMMENT ON COLUMN "public"."ali_phone_info"."addr" IS '号码归属地';
COMMENT ON COLUMN "public"."ali_phone_info"."phone_num" IS '手机号码';
-- ----------------------------
-- Alter Sequences Owned By
-- ----------------------------
-- ----------------------------
-- Primary Key structure for table ali_phone_info
-- ----------------------------
ALTER TABLE "public"."ali_phone_info" ADD PRIMARY KEY ("id");
**环境信息**
* CentOS Linux release 7.3.1611
* python 2.7.5
* pip install request
* pip install sqlalchemy
* pip install psycopg2
* psql (9.2.18, server 9.5.0)
| 25.711111 | 68 | 0.639585 | yue_Hant | 0.639319 |
61f646aa13fa3af40e73f5126be95f35f8719998 | 778 | md | Markdown | README.md | quilicicf/vscode-markdown-spec-formatter | 467d8b20029aeb5d8b3b355c8c243683f4f43380 | [
"Apache-2.0"
] | null | null | null | README.md | quilicicf/vscode-markdown-spec-formatter | 467d8b20029aeb5d8b3b355c8c243683f4f43380 | [
"Apache-2.0"
] | 5 | 2020-04-27T17:13:56.000Z | 2022-01-27T13:15:56.000Z | README.md | quilicicf/vscode-markdown-spec-formatter | 467d8b20029aeb5d8b3b355c8c243683f4f43380 | [
"Apache-2.0"
] | null | null | null | # markdown-spec-formatter package
> A markdown formatter for [VSCode](https://code.visualstudio.com/) based on [@quilicicf/markdown-formatter](https://github.com/quilicicf/markdown-formatter)
<!-- TOC START min:2 max:4 -->
* [Install](#install)
* [More info](#more-info)
<!-- TOC END -->
## Install
Install this extension from the extension panel in VSCode.
Or use the command palette (`ctrl+p`) and paste the command: `ext install quilicicf.markdown-spec-formatter`.
## More info
This package registers [@quilicicf/markdown-formatter](https://github.com/quilicicf/markdown-formatter) as a formatter for markdown in VSCode.
For more information about the applied formatting, please refer to [markdown-formatter's doc](https://github.com/quilicicf/markdown-formatter).
| 33.826087 | 157 | 0.754499 | eng_Latn | 0.739983 |
61f64f8a942c4c02143e8f2ce837e919e5f78c13 | 2,636 | md | Markdown | ui/wallet/src/app/help/en/welcome.md | Blurt-Blockchain/steem | fbffd373cdb0f6192aa8806d07e8671e219c3767 | [
"MIT"
] | 2 | 2020-04-21T03:10:06.000Z | 2020-04-21T05:49:46.000Z | ui/wallet/src/app/help/en/welcome.md | Blurt-Blockchain/steem | fbffd373cdb0f6192aa8806d07e8671e219c3767 | [
"MIT"
] | 4 | 2020-04-22T05:14:18.000Z | 2020-04-22T07:59:20.000Z | ui/wallet/src/app/help/en/welcome.md | Blurt-Blockchain/steem | fbffd373cdb0f6192aa8806d07e8671e219c3767 | [
"MIT"
] | 2 | 2020-04-22T05:04:29.000Z | 2020-10-23T13:58:19.000Z | <span id="disable_router_nav_history_direction_check"></span>
## Welcome to Blurt!
---
Now that you have an account, here's how to get started.
---
### 1. Backup your password
Unlike centralized web services, **it is not possible to recover lost passwords on the Blurt blockchain**.
You are entirely responsible for saving your password, backing it up, and keeping it secure.
Never put your password into unverified third party websites as they may steal your account.
### 2. Really, backup your password!
Store an **offline copy** of your password somewhere safe in case of a hard drive failure or other calamity.
Consider using a flash drive, secure cloud storage, or simply print it on paper.
### 3. Some ground rules
1. It is free to post, comment, and vote on all content at <a target="_blank" href="https://blurt.blog">blurt.blog</a>.
2. Do not plagiarize, and be sure to cite sources, including copyrighted images.
### 3. Update your profile
You can update your profile on the Settings page.
This includes your display name, location, about information, and website.
### 4. Create your "Introduceyourself" Post
The tradition for new users is to create an "introduceyourself" post in order
to let the community get to know you. You can verify other social media
accounts (Twitter, Facebook, etc.) by sharing the link to your Blurt account
from those profiles.
### 5. Sign up for Blurt Chat
A lot of users mingle and chat in [Blurt Discord](https://discord.blurt.world). It is a
great place to meet people!
Ask questions in the [\#helpdesk](https://discord.blurt.world) channel.
### 6. Voting and Tokens
Posts and comments can accrue votes over a period of 7 days. Projected payouts
will fluctuate (up and down) and no payout is guaranteed. If a post receives
enough votes for a payout, it will be split between the author (at least 50%)
and voters ("curators").
BLURT, and Blurt Power (BP) are the two forms of digital
currency used by the Blurt Blockchain. More information
[here](https://blurt.world/faq.html#What_is_the_difference_between_BLURT__BLURT_Power__and_Blurt_Dollars).
##### Additional resources
- [FAQ](https://blurt.blog/faq.html) - Answers to commonly asked questions
- [Blurt Bluepaper](https://blurt.io/blurt-bluepaper.pdf) - Explanation of how the platform works
- [Blurt Whitepaper](https://blurt.io/blurt-whitepaper.pdf) - Technical details of the Blurt blockchain
- [Apps Built on Blurt](https://blurtrojects.com/) - Directory of apps, sites and tools built by Blurt community
- [Blurt Block Explorer](https://blocks.blurtwallet.com/) - Shows the raw Blurt blockchain data
| 40.553846 | 119 | 0.758346 | eng_Latn | 0.993485 |
61f65ed361916e991b3ec1973e3490a5a08ec111 | 737 | md | Markdown | docs/_kb/KHV032.md | wongearl/kube-hunter | 00eb0dfa87d8a1b2932f2fa0bfc67c57e7a4ed03 | [
"Apache-2.0"
] | 3,521 | 2018-08-15T15:43:57.000Z | 2022-03-31T07:17:39.000Z | docs/_kb/KHV032.md | wongearl/kube-hunter | 00eb0dfa87d8a1b2932f2fa0bfc67c57e7a4ed03 | [
"Apache-2.0"
] | 350 | 2018-08-16T16:13:12.000Z | 2022-03-22T16:22:36.000Z | docs/_kb/KHV032.md | wongearl/kube-hunter | 00eb0dfa87d8a1b2932f2fa0bfc67c57e7a4ed03 | [
"Apache-2.0"
] | 551 | 2018-08-15T15:53:27.000Z | 2022-03-30T02:53:40.000Z | ---
vid: KHV032
title: Etcd Remote Read Access Event
categories: [Access Risk]
---
# {{ page.vid }} - {{ page.title }}
## Issue description
Etcd (Kubernetes' Database) is accessible without authentication. This exposes the entire state of your Kubernetes cluster to the reader.
## Remediation
Ensure your etcd is accepting connections only from the Kubernetes API, using the `--trusted-ca-file` etcd flag. This is usually done by the installer, or cloud platform.
## References
- [etcd - Transport security model](https://etcd.io/docs/v3.4.0/op-guide/security/)
- [Operating etcd clusters for Kubernetes - Securing etcd clusters](https://kubernetes.io/docs/tasks/administer-cluster/configure-upgrade-etcd/#securing-etcd-clusters) | 36.85 | 171 | 0.757123 | eng_Latn | 0.812124 |
61f6c748e775bc892e4b5ac688f5cf36058249f6 | 957 | md | Markdown | PITFALLS.md | noorus/nil | af982787d4cb052ffd7414ed5aed16c4851b1403 | [
"MIT"
] | 9 | 2015-07-20T14:49:49.000Z | 2022-01-17T02:52:54.000Z | PITFALLS.md | noorus/nil | af982787d4cb052ffd7414ed5aed16c4851b1403 | [
"MIT"
] | 5 | 2015-03-30T02:26:24.000Z | 2021-07-26T21:57:20.000Z | PITFALLS.md | noorus/nil | af982787d4cb052ffd7414ed5aed16c4851b1403 | [
"MIT"
] | 3 | 2015-06-19T20:29:17.000Z | 2017-09-19T03:36:55.000Z | There are a few possible pitfalls to make note of when using Nil on Windows:
* The XInput API is not buffered or event-based, it is poll-only. If you don't update the system fast enough, you could miss entire button presses on the XBOX gamepads. At least 30 FPS is recommended.
* The RawInput API only supports up to five buttons per mouse. I don't have a mouse with more than five native buttons, so it is to be investigated how such could be supported in Nil.
* Using the background cooperation mode (that is, global input) could bring up warnings from antivirus software when using your program. This is because global keyboard input can be used to implement keyloggers. I suggest sticking to foreground cooperation mode only, unless you really need global input.
Note that these pitfalls are not unique to Nil in any way, but rather are due to limitations in the underlying APIs. All other input systems suffer from the same limitations and problems.
| 119.625 | 304 | 0.796238 | eng_Latn | 0.999858 |
61f6cc7cc9a7fb94f75f6dc40545972f01892319 | 378 | md | Markdown | docs/adr/20200107-testing-framework.md | pchudzik/edu-ddd | 2601cf0e2a35eb087ea0949d082fbc553ddda77b | [
"MIT"
] | null | null | null | docs/adr/20200107-testing-framework.md | pchudzik/edu-ddd | 2601cf0e2a35eb087ea0949d082fbc553ddda77b | [
"MIT"
] | 3 | 2021-12-14T21:38:45.000Z | 2022-01-04T16:36:23.000Z | docs/adr/20200107-testing-framework.md | pchudzik/edu-ddd | 2601cf0e2a35eb087ea0949d082fbc553ddda77b | [
"MIT"
] | null | null | null | # Which testing framework to use
* Status: accepted
## Context and Problem Statement
Which testing framework to use.
## Decision Drivers <!-- optional -->
* fast to write tests in
## Considered Options
* spock
* junit5
* testng
## Decision Outcome
Chosen option: spock, because:
* I like it
* tests are in groovy (duck typing)
* offers syntax to write behavioral tests
| 15.12 | 41 | 0.722222 | eng_Latn | 0.990472 |
61f709d7b989578df47b4ee0e2ad8556d0d59013 | 12,869 | md | Markdown | contents/byway/2074.md | melitele/byways | 973924f06d66a8020dd0ea58ce640dbd406f95bb | [
"CC-BY-3.0",
"MIT"
] | 4 | 2019-01-26T20:50:25.000Z | 2020-08-12T20:47:19.000Z | contents/byway/2074.md | melitele/byways | 973924f06d66a8020dd0ea58ce640dbd406f95bb | [
"CC-BY-3.0",
"MIT"
] | 3 | 2019-04-01T17:25:37.000Z | 2021-03-09T01:55:25.000Z | contents/byway/2074.md | melitele/byways | 973924f06d66a8020dd0ea58ce640dbd406f95bb | [
"CC-BY-3.0",
"MIT"
] | 7 | 2017-04-19T22:52:17.000Z | 2021-07-13T03:51:42.000Z | ---
template: byway.jade
id: "2074"
name: Louisiana Scenic Bayou Byway
distance: "521"
duration: Allow at least 2 days.
description: "The Louisiana Bayou Byway consists of 521 miles of state designated scenic byways in Southeast Louisiana. Visit the historic houses, museums, plantations, State Parks/Commemorative Areas, and fairs and festivals in the area."
contact: "**Lacombe Heritage Center** \r\n985-882-7218"
path:
- "mia|DxiioPlP_b@xIuSNy@`Z{t@r~B{{F`AyCt@aDx@oGxF_kAf@mGfAuFjBqEjB{C|E}ElGgFfCgCrNiPxA_C~Ps_@dAmDhAeGjEsZpA{Kj@kHTcA`AwBlBmBhAm@pAe@`BS`FWnC^jKS`EYvFqAfmGumBtEmBpJyGnE{BdIeBdU{CjKeCj\\gKnJqDzEwBhIkEt_DqhBt_DaiBhMkGxUgJlKgFxfBkcAfb@qU|u@gc@pKsGzrByjAnJiGfOuL`JwGvWiO|I{E"
- "}qrzDjl`nPaAnIOjC@rCfAtUj@`f@Njc@h@fFzChORxAdEli@nCbNv@dGDbBUnDiCdSY`Ep@b^^fFhAzG~@vCxB|D`ErFlApBvFfL`JbSnDtGfJhS~C`FzDlElEdDtElCpHrC~J`BdCXdEOnB]vD_AjHoDdKsHrNsHdBm@tCk@|Fm@fA?fa@`Fvh@`@bFA~GQxGBvCh@~KpDdAj@hGvGdBhAFr@G~@Nn@LbB?xCWhEiAtGcExRi@xFVnDl@~CdBfF`HtJjCjBrDD|D_BbDy@zAEn^wDxH}CjHeHnJwNtLgYvFoLF[bEgLrAqFfAcIAm@OWbAuGf@sHtAoGpDyGxCsEnBaChAeAvQ}LhK_G~DgCpOuNdLaJhKmHnGqCnMyC`Eg@nE?h^dDfDl@j_@|KdCh@pIr@v\\xEtCx@tTbIbMrDlGx@lEJpKWxAUhAc@v`@yWhGkDbHoChHkBtFgAjBSfR}@fGm@zCw@`Ag@vD{CxAcBrBsCpBgEdAoDZ_CGaDm@eEiJcZW{BAeCpAaLzFy`@nBgJfBaHzCqNrC_IdAgEbCkG`BsCfC{BrBeAn_@aL`[iNjBaApByAvA_BvHuLpAqC`AmDhAuF`@sCHyCUsHmCgd@i@oND_Fb@uDhCyMbBiGvD{F|AkBhP{OjCaEvGoMNc@NuB@aE`@mRx@cIzGgTxAmDp@_AfDiCpAw@|@WpACdC^rz@jRjALtBBlEWvBe@dBy@pTaO|^qUbAYhAArI~AfFr@t@@n@Md@W~HiGlBkAnEmAlJaBbG_Cvk@uXfRuLhBm@`JkBjc@oIjBe@xAm@fCgB~hAkbAtEkDtBs@nLgFlIkBp@[|CeC`GgClA]fMeBpAYp@[bCmBs^ul@~H_M|C{CpAgDn@yCt@sCrCwEjGcJjEsE`CqB`By@p@OxOwArEGhHRhTgCrCw@n@a@lAkAxC{E|@kBVs@^sE\\mA^w@lB{BdDyCnZ{KbMoDhB_A~CqC|JaDfC_@t@Y|@_AxCsEf@sAN_AGoAsCkMq@{DEaHPmCh@mCn@kAtFoEhBwCfDmHR{@T_CEaB_@}CqBqJq@yFI_ACuHSqCBaAd@gB|@yAZ}@|@uF`AsDxAeBnCgEl@a@lGoCzIuItAy@rCy@p@[rBkCv@s@hEmBrHeHbFuBbDw@fCyAfCmBvBaDfEyHtLq\\Vg@p@q@lb@iSdGuBfH_BpF_BvAm@n@}@`AaCxAuBvBsEhAcB|AiAdNeH`EsCnC_AfCmBx@gAt@_Cr@kAjLeNhE}F`GoDxC_B`DyANMBQnDyBnAa@tFmAbGqBzL_Gh@A~Cl@|AD|@_DpAyAZs@D_@MiLDmAJ_@xIaL^Qf@EhHp@`FLhIY~Dg@tKcC|@{A~C{GlFsF\\{A?yCJ}@hBmD|@aAfQuNtAw@`Bg@z@MxEIzGa@t@QxAeAbD{ApQsBpHWrJrAzBH|@KfC_B|HgGvCyA~Cs@zAeAv@gAdF{Kx@_AhAq@h@Of@Ct@R`CxDrAv@`ALn@_@zAuD`AeD\\_EXq@z@{@lEaDt@}AJgBKaENeB|FcR`@mBCk@IWk@sAi@i@aMgHcAw@w@mAUk@Sy@G{AF_BXmAfIaRdA{C`AmHdGum@d@wBhAaCfXi\\^[tAq@tAQhADpQbDbA@`BKxBs@rAy@`KyJlBiA`B]`LsA`BAzJ`@|AAfCk@rI_D~AgArKsNrCyCbMkJ`QsI~@k@dByAn@y@fAgBfA_DZyAXaDFeEG{Ab@_O\\mCtCoMhAyEx@sB`AsAhAgAzGgE|DyCbBuBtFeKlAkBdB}A~a@u\\vAmBd@aAxAsDdGyQpAaCnd@sf@xAmAtAs@jC_@z]S~MMbBQvu@{U`aA{i@fBm@xCc@pBAfCVlBf@bBx@`C`BlDrE~FfGn@f@jBp@dBXjLG`CLnQxBrARpAf@nEnCjDfAdDl@ZoOXmHV_Bp@gCdByCtAsAnBeApi@wStDkBhBmBtAiBvBqDt@aBpBwHV}EGeEV_C`@uAfEgKpAqE`AuElAiKbCmKxBmG~CuFrFoLxGoSp@gDHqAZyK`@aEG{AmJuTs@{@aAc@yLaDyH_F_EaBwE_EiJcKiBiA_]wMwB]mEGaBYq@e@}GyHqAaAyC_Bk@g@yDiI{@cAoDgCuEqAcPoDkAe@q@e@{O_TuImPsCmDuEaFeAq@u@Q{EGy@Mo@W{AmAiB{C_AcAkAa@qBGqAU}@a@mAgAeBkD_BmEcAuE{AuJW}@u@gAs@_@gHwAkAm@aDuDq@]iFgAy@g@_@q@i@aCeBsJuAgDyAaL??"
- "}qrzDjl`nPvEgCzW}Ofs@oa@lFcEbCeCtDkExEcIrDmJnBaHx@eEdAwI`^}oEv@gLh@aOtU}wC`E{h@fDq`@hDsc@r@qHbBqMxQi|BtS_lCPsFBsDSwMgF}l@xEg@p`@{FdI_ADxBZnDhD`OlB`Pr@rEx@jD`A`D|J|WrCdOh@dCtAxExIpTfP~^nDfI~FzLxB|CbInJjLvKrH~HxPnR`GbIlIrItCdCpHrFhJnHrGdGvBzAdE`CxAn@rKbDxClAzL~DlYdHnAz@pDvDtBxApFxAnLtB|CRhNm@|Ea@jRwBrSgB|M_C|Q{GhS_JvGcDlCeBzDaDtFwFtEsFlKcQrBiExBgGfQox@lF{S^{CFyBWsF?gE^sDlEqYlBaNtBqRn@oK^kDt@gEbBsFpFiPlQyf@zCmHhFaLxEuKlBsBnp@iPzCYtBJzvCzh@b~@`Qr`@tG`{@hPrHlAfSqz@|@gEdVwbAdGcX~Gk^`G_]bByFhBgFlCuGtAeEbGkU|Ry{@|AcGjDkOza@efBz\\gvA|c@amBxOap@~M{k@lG}VbAsFnEcQzO{q@tEwQj^{|AbSwy@jAgHX{HG_Gm@gHoAmGaDcIiMi_@ec@_uAce@yxAoFiRaAuFu@yFi@aGWsG[iPf@cUNaMKmFc@{GBk@Vy@h@g@ZGl_@sAff@qArUG`gCRvGKx_BaFdXKhLM~CShBWjOeFdCe@pBI|A@zBPlX`DtFpBbK|GfYxRvAhArD|DpDnCdjGfcEvKtHhaAno@`FpDhKtGnG`DbEx@tFl@lGHdEWrv@aOtDa@jDKzCFnCVfDp@vVhJpDv@jAN~DJrCMvBYtEqA`VaIla@uNjbDsgA`KoDx[aNvGeD"
- "ee|wDdk{kP`Ae@|OwJxDkClAgBbAyF\\k@vJmE`K}GfLkHtAWv@m@pK}JtOwUzBiCxA_A|As@vHsBvfAkTxEsArGkCrUiMt@Kt@[~W_O~yBakA~_Bwu@rlEgyBpLsHtJuHbDuCjIyIbLoNdF_IpEiIbCmFbCaHtCiLzAmIzAcMl@eMDoHPoFdEmVfAsE|@oCtAqChDoFBW~B{C`CeE`AeCtGiSxEcL|z@_lBtvAk`Dz_@{y@pNwYzUoh@`c@k_Avg@_iAjP__@xA{Dt@iCfDqOdZuzAxOk|@~EwZnCoMzBgNhAuHhBuSh@sDp@yCnIkW~Cr@rFvBrAr@b@`@~AfCn@bD^lDj@pOVlKJhLSjXu@h_@C|Oh@fO`BdPj@hCxDtJtAtCxAbCbDhEnEfEdCjBtCzAbDlA|JpCtDXfDE|Km@jC_@|YiHxFiApa@gEvAE|BJlVdDzDdA`KzDzBdAji@r^hS|PnDjBlInH|E|ExQ~SrGlGdKfJlMfKzAdAhOhHxAf@vGvAjKdBhBFpC?`KG`FQfPyBvGsAnHgBl^uJnRsExCe@jHkB|HyAxAMnES~EFxEb@vEfAxKnDbHrDdFxEbCrCVp@vRv\\hAtAr@f@~BnAnBRhBLvH?nEq@dEaBfEgDvCaDzCmEbBeDrFoOv@eDb@yCpBqSh@aJr@se@c@gUFOB_@?mNEsEXkRO]b@aJd@aDv@mCt@aB`EaHhAwAlAmAnByA|As@jC}@hBYbCMhD?rK^pBPbL`CtLrDzG~BhB~@xJlGdNzGrGfCt^bPfZfKxNxDtQrBjBPrDBnEc@pP}D~Am@pByAjLuLlBkCdUqWzCaEdJaKtEcGjJeNzLcSlKyVhQwl@l@aC~CaQtA{EtBsFvBiEfC_E|EgGbD_D`DeC~G}DnHsCvFwAfG{@vHYfBCzFRtUpBxPx@jHj@lFFbIg@bSKtOYfk@mBbLRjAJvJZb]EvEMhTyB`Hy@fDs@hk@iQn[uMjB_AnDaCpAiAhCoCpSqZvCyG|FwRpBsFlPs\\`D_GfDaE|JyJnSiT|J{JjNiMjBsB|KiQbL{RnDuHlBmFrByH~@mE~CyTnCgMj@kETmEtAct@ZmFt@{HhBuLtDuOzK_\\`Ned@`GqSzGyW~Mss@rCsQjBqOzDuYhAoGnHc_@tA{FhI{WfTko@tAsDzN_\\`JgUtCyJtIa`@tHyZzDmJhAqErDyQlF}b@hB_ZXmIFeG"
- "gokwDflcjPjAtiFErIi@xG[tBgBtGeAjCiB~CkBfCsyBtmCoHvJiF|JsAlD{DbNaAdF{Y~gB{EhXgp@z`EsQfgAeAfFmAlDwAtCeChDyAtA{B~AaDxAs_GnwBoVlJsxEveB}o@jVkGjCohFxjDoJtGiJdHiKnGkdAnr@cYvQWBsJfGkP`Is@TuDj@yERsHBeLjC{CZkt@x@gCJoEr@{BPsA@eOe@y[sAiRg@gRd@mUXoUZu@dBsYd@sBccCi`@j@gL`@gWb@_hAfCinB~Ea[n@s^j@aDJcIz@QEyGjAgC|@mIfE_FxBe{@xb@eE~BcTbKAF}GlDqo@z[qI`D}QzIqM`HsHrDcNlI_b@|SkMdF}_@nRm`CdcAoD`AoALgADuJMeEl@iBr@iBdAwArAmH`KmB`BmAv@ikOlrGgMvF{e@lYeG|CkBf@}B^cLjAibEx`@m]rDaPxA{~BpUwCf@{ChAsBfAsCfCyBxCaAxBiArDY`BqAlL{Gnt@eAfGyAnE{AvCeA~AsCtCuA`A?Jw^nSsBlAyDxCuErFyL|RsB~EqArFsInf@u@rDaClGsAlCmBdCqEnEmOzLsBnCcBhDkQpe@mMj^u@`Bei@bxAkB~Gg@lCY`DuGveAeBjWYbIOzKa@~Fg@lDq@~Ck@`Bu@bB_DbEeA^e@Gc@]}EuJ{DgHsc@s|@{FsKuAgDgB_Jo@{B}@sBuAaBmIeIoCuCmAyBaDyGqAsBcBeBgl@ad@wDgCiEkDi\\s\\mMeNeB{AyAm@yMsC_SwD}BcAoB}A}AoBuBkFw@gD_CuQkFqZ}K}W}AkCkByBiN}MqDaE_p@wz@mAmBa@iAWs@c@cC[qCAqCFsArBkUhBoW|Dgc@lGix@ZsBr@mBfAeBhEmIlBuEx@eDzDeSd@mAJoAGcKFaTNqFGwMxE_VzIc[rBmD~IwLp@kAf@yAR{AH}DUaKw@cTMaB_@_Bm@yAy@qAsRySoDsGyAqDoAoDkBuHa@wCc@uG?yBFmCXgDVqCl@oDtBuHnEyM`AaEZgDb@aIvBu[fGe^RgBJyBGgCsGyd@OsACsAfC{Z|AqT`Dw_@j@yCZaAhEiH^_AxTaw@p@uDRyEmBw^GoC`Cmt@FmF_AwTyAye@QkDe@uBe@wAmXoi@y@kBy@aDWqCu@mVGaLKsBkD}W_@yBoeBi~E}Y}y@i@qB_@uBSsE~Aib@BuCwAmnBq@wi@{Bgj@cBqQw@_Gw@oD}AsEwAuCaCwDsCeDsGuG{C{DRe@YsCDqCf@cC|@eDV{BBkAWuPu@yDuAoEKmKeE?Yut@W_CyJab@eAkDgBoEes@w`BaAkCk@aEI_Cw@ukAy@eq@]_^LuOVkJzB_i@pFuj@FqCCsC_@gEqj@a~D_QoqAaJ_p@aKqw@GoDJwDzGaq@RcD`AygBiA{]cCco@yE}x@g@oU@gBPmC|@gF`Q{j@dE_MpKeXZmAr@gD`@sD`Eka@`Gyc@J_B@qEYiHCmBReEh@sD~_AuqDfW{t@hMc]tBsE`KsQpBqEhv@}_CxEwLfI_QnB{EvEwObJi^je@scBzBqGbIoSnDuLbAoC~K}]`Pgd@dIuQt@mDtDcT"
- "wevxDdjtgPtRw`@jE~C|@^|AXja@jB~AXpBr@hFxDlGzDhDfAtBNjp@uKrD_AlDkBrDeAlDOhNB~Dg@tEmBhE_AfG?nCXj\\xL|FfB`RdEvDtAxMlGfHlCpG`Bnc@zIlCt@~An@dEdCt@x@fD`EfBpCpAhCp@bBbEbOd@dCfDtMpAlDdOhU^RhA~Ax@lBj@xCf@vFl@jDtM|SzDxFhAfELx@F`C}@nNW`L@hArDdY~GwBzVuGhg@aMdTiHfBQhBJpAZpHlEhFxDfF~FfDdBjVzGjCRrDQjIpA|I`IlOzLlD~C`H|M`CpDfDrDnCbB|IlEdGjCpGzDdXlSpAtAxB~C|NzUlU``@lCvD`BhBlBvAnp@f`@dD~Dh@dA\\jARfBf@fZLbY]tOTj]iAzOgAzKAjCf@vIzBdORnFFlM`@bCtBdI\\dCM~^iAxEsCzImGdOu@~BGdAm@xCcClGe@lDB~EJnCFrGArB]lEyAbODpCrAtKO`G{@xHGtDL~KSt_@f@fFn@~ArBjBjMxGjMvKvDjC|FtGlFlHlA`JpAdEzBlElKnLTpAsBbKf@~BnA~@dLzDxDlBlBl@xBlAx@PpFoDlAk@vCYzM`ExJ~@|BMvBd@vDnBrFpAp@StByBpAk@hB]`BNbCh@tB~AfBxBzIfI~A~@dCE|EiFnAWjAXbAxAAlLPfAvAdBhA^lG^h@`A?jA_@z@oCjC}D`GiAxBqFhM_K|T}DfL_CrDc@d@u@rAwGbOkFlUoA`DkCnCqDjC_CjA_HdBkHh@uArAgDjKmFnMq@tJD~El@fDvAhCnDjEhF~EbCzAtEfA|Aj@dAbAl@t@fD~FvAlBvA`ApBb@|BRx@^lAzAdDzHxBlDzAxApNzJ`DhCvGpI~@~@lAr@xDbBdArA|A~Cb@bC?d@Gd@yB~Fa@`BiArIl@FtF?xXS|BFbC~@dAr@`AdAv@rAxC~GrNjY`AvCLvBCpBeChQ?nKd@fPPrChEnP|AfFhBjCnA~@bjBd~@nFzAtC\\bg@WfEfA`@VtBdCvAzDV~ABnBBzt@RbPJpa@\\|]K`GK`AeAdF{A`Ey@~EBxAPfCxEjZf@xH?fBg@fLOlMrt@DxGBlE\\|Cj@lu@lVhFvAt@XbPtIdAv@`@l@xKvRdDdF~EnJDr@bApDdOzb@pAvCpBbDhB|BfBdBpJbIdVbStk@rd@"
- "_jd{DvikhPzPhEdABpBe@rAeArL{KhDoCbz@cc@|DaCrCkCzCsD~N_SlB_Ch@e@dB_AbDs@vtAe@he@?x|@YtwKuAnCQx@S~CiApd@sXpCyAlBk@~BY~GYlCJhBl@pAdA`AxAj@nBzA`M^|Bx@bBd@j@f@d@dBx@~HfAvRhB`E@bGe@fFmAd_@{LjB{@wFyHsf@kl@uAaCeB}EaAwFsIup@U{EeAy^[gFe@wCvYyF~AOfB@zAPl}@pYxd@hQjDl@pDJdE]ti@cIlc@sIrIuBzOgDhj@yKzTsBjRuAjj@kAhTeBxE_@hZyIvBMxGVpGSb[sBtK_BtSeF`Me@hPSxDaAhAy@jJiOjXkf@lDkFrB{AjEsBfFkAfF{AdE_BlCkBzBmBtFyFzT}UpI{H~G{FnL_H|IiDvv@cUjHwAzGo@lx@iCxKIdJ`@jGy@bMkClEa@`NF`VM`PR|Ql@|MJfwAm@vBRlKdBfEJfAK`IaB~VeG|Ea@|GOxD?~GXfE?xFXfIPrRaAvFKvCEfBPlELvAErEkA~FaA~KMbDe@p`@gKdd@gEnb@yErOKhJYjLuAxQsIzDaAr]kD|Ce@lNmGrMuI~Ay@nTiGlEcBfIaErGaElGyEj@YtWoQtB{Oh@{AtAwBvXmZvBkD~DqNxCwF~AgD"
- "_nc{DpargP?fxDG|YBzxAC|MFjdCGrd@Jj}@t@nx@NnwAGpCg@`EgPf~@uAlHkDlOgBxIGf@"
- "uzlxDtyvbPrCpC|A~@`CX`ILdFlAt[tJrPvE`Q`Gdg@|Nt[rJba@zKzn@~Qnc@fNrKfChd@pMtWdIlVxGvUlHxRrF|KtDn}@zVnNnEvP|ErDf@vLl@dB^hi@tOyPv_@uCzHgArDsVbkAm`@bhBu@zBaB~DaFtHkC~EmAxDM~@I~C@lBzBvUNtFDrCEdEc@pSkC|c@wGvqAgT``E_@lKgBbYuG`oAkBjOcTlkAoQh`AaBvJm@rCu@rFP~Cb@lCZv@bA`BpGhGfL~Iz@v@z@nAb@~@ZxA^rDGrC[vAeBpGwGfUiJrZygAjvDY~@aF|KuDfLqJp\\gd@d|AiCfKmDzKqOvh@iAlDaRto@yKt^{T`w@eQ`d@gx@xpBaApBuHbRc[hw@WRiAxB{BzCiB~AyCjBaEnAcj@lJ[XoVxFqARuKl@kAPcCv@iCrAkC`BeMtGkCdBi@p@aB`CwB~DaG~VcEzP_C`MSdBy@`EqRbx@wF~VeBfGmP~r@aA`Fo[bsAs@~GStFFdFZ`EZdCxKhd@dHlX`A`F`@lDJnD?nOJdBSh[S~Eu@fFqArEgCbF{pA~cBiJxMyDlHuYvw@qClIgEpKwKnZ_LpYeEfN}AbJmF`t@gAhLmAnRq@zNc@bPe@|YSb[a@hK[bc@e@l]D`I]zPg@lI}@lG_]liEa@fL}A``Cu@jt@[lPsDdrAK~PJzIn@tK|Dh\\jDlWzHdn@`CvPdAzId@xJFvMoAjX{Cvg@OnFOvR\\rUFhO_@zKmBrkAUfGyAlLiC|KcJ~^SrA}@`I?lH~A`MtBnN|DtZpCdRzLh~@tAtIdEx\\hApHKbJu@pYm@bNO|HOXGfAq@|^B`AgCrm@iBry@e@~^D~G^lQRdXGzCSpCoBhOKnLBfI\\rGbD|X~@pS~BrpBTzHjEpd@fCpVp@zE|@lDx@jBjClE`BlBjGlEtHzH~HdHvCxB@@"
- "a_d{DxgqgPr@LvBdAxA~A\\t@vC`KxAvBJrA"
- "_nc{DpargPBsA|AsBrCoJf@kAjBqBdDiA"
- "use|D~|kgP|_@`FrtAhPjKjBbHzAjKzC`_A`^vgAla@rIjC~GbBdKdBhIjAtMz@tJPphDErMFtLn@z\\zD`z@tKzj@nGxKdAnkAdNvSnBpKxA`Kr@pJPdqAEhSDdK^n~@C|f@c@xV@"
- "a_d{DxgqgPxOKvPB"
- "o|b{DpgqgP~tA?nVZ`HBzbAIn\\e@hSHzDhA`BP~I?C}WGQJosA@_A`AcIvDqX|OE?oo@OwEQuDoDag@u@_E}AaG}AgD{BgCgCyAmHeFqPoMcCqBuAuA{t@suAsBmFi@_C_@uEKoUEq^@wpAAeEKsAa@aB{@eBw@_AqHgH{AmCeBsG}@{EiJeb@qAcJ{CsWiQ_tAwCsViFai@aBwJoGi\\{FeX_Zsz@aAiE{BoR?eCdAcIh@gLfDggAG_c@b@}`@C_A_Dk[C_CN{g@LgkAaY_rDo@mMeAgb@FeEVgE`Fsc@JuBO{ZJuBXqBdBmE~Nk[l@mB^qBJmFBs`@XgJtAkQJ{XQgDk@eDuAqNq@qF_@uAq@eAyDgCyGyC}EeA_@Mk@q@i@uAOw@qAm]a@qB{@gAkAm@ox@oVoCqAu@u@u@iBcIsa@wFkRyAeCw@u@iLmHkKgG{p@wa@kPuJ}DcBkQoF{BaAG_Kq@y`@LmDdGe^j@kC`@iA~IgQzAkBvJaKpA{A`@w@t@gDJmEUaDeBsH_@gCGyBFaBZ_DVgAvBcErC{DxEsFdAaAtEsBjPaGlAy@h@k@j@eA`@_BrCiRVgDFaFF_@Tg@bAk@_AsCmAaFc^qnAMoA@cB\\aBhBgDJeAKeAi@aA}RaUiAiCa@sBKqCFsA|Ema@pAiM\\sM?yH]}HuEix@cDul@k@{MgAmPwTcbEoEkbA}Ccu@WoIeAsk@WaKoCyd@qAoMwF_e@oAgHuAaG{Pio@{DeNcRsg@kAwDoA}EqAaHaBwN{Cw]}@iIW_FmAw_@CsCJaERuC`[ewBtAwIfBwI|BiJhAyD|DiLzs@yjBfC{H`Jc[pC{HnD{Hd[gh@dFgI`AqAn]mZ~@q@dHsD|^uKpBeA~CsCzBuDt@wBf@sBbCkPh@cChAcDdBcDxv@shAvHoLp\\ee@vF{Epc@}\\|PcN~CaDxR{UfFaIbGcKxAqCb@sCB}K@w^QqSOeCBgV~TA?iRmAuPJeClE_]j@{DfByNrBuN|g@tNjC^rCPpCEjBQxEy@lN{Dp[aIxLuBlEo@lFa@zHqAjBO~PQpDYvCs@hAc@jN{GpDyAlBa@pBKh@JlPz@zt@rDfKp@hEzAz_@lQxLfFnLxFj`@bQ|EfBdVfFpCJ`DMp^_IdKeCrCa@rCSvCCpEV`BXtEvApCnAzBrAfHdGrSdP`BvAbBlBhBfDh@~ApHpZ|@~ClArCfTx\\rBpCv@r@zBxAlEdB|SpHzLvEzDx@~VlBzO`BdRvAxWfClJvA|IzBlEzAzHxDhGxEjP|MfGrE|cA~b@rHbCtPfEfDn@pNhBfu@hFxj@~EtGXlZhCfHd@nAlPrC|X`BzRNtE?zCm@`QBhIz@rGxArNzGvWrDhMnO`k@zCzLxTtx@bTbx@nAfCpA~AvTxSbu@~q@tc@lb@r]v[zs@pq@jGvFjClC`BfCz_@d`AjQpb@|BtG\\|AhAzIlBbL`AlFd@vAhBhCbD~D|UdXbO`PrPnPfYdZhJvEr`@|PlBtAdKzL`EpFxQxSx~@hhAlKnLhAfApk@b`@xIrFbh@|M|Br@jAr@t@p@zLjShOzShKhNzDrEbGhGlFdFfGfJzGrK`EnE`GjI|@z@x@f@h[bQvH~F~ElElAfCZpA`BbST~ExEbe@\\pEv@`G?`@"
websites:
- url: "http://www.tangi-cvb.org"
name: Tangipahoa Parish in Louisiana
- url: "http://www.VisitBatonRouge.com"
name: Visit Baton Rouge
- url: "http://visitlivingstonparish.com/"
name: Visit Livingston Parish in Louisiana
designations:
- Louisiana State Scenic Byway
states:
- LA
ll:
- -91.8033289996336
- 30.977679999732175
bounds:
- - -91.8033289996336
- 29.795740000400656
- - -89.74250800024402
- 30.999790000408154
---
<p>Louisiana's Bayou Byway route explores the back roads in 13 parishes and covers some 500 miles through Ascension, Assumption, Iberville, Pointe Coupee, East Baton Rouge, West Baton Rouge, East and West Feliciana, Livingston, St. Helena, St. Tammany, Tangipahoa, and Washington. Ease out of the fast lane, exit the interstate, enjoy the culture, history, and beauty of this homeland. Let the locals treat you like homefolks as you wander the Louisiana Scenic Bayou Byway through a state known as host to the world and home to the adventurous.</p>
<p>Along the byway are numerous properties listed in the National Register historic districts, and three National historic landmarks. You will be able to experience
Louisiana's ties to Europe, Africa, the Caribbean Basin,
Canada, Asia, and the rest of these United States. Many early 1700's records can be found in historic courthouses along the Louisiana Scenic Bayou Byway.</p> | 257.38 | 2,273 | 0.810242 | yue_Hant | 0.620935 |
61f7fb9e014a59479c4d1a07143f950e8722f0c9 | 942 | md | Markdown | content/posts/2008-04-11-photographers-strike-back-in-uk.md | yshz/blog | eff12db2e9df0fbc10fad54f92890511fa07b1f0 | [
"MIT"
] | null | null | null | content/posts/2008-04-11-photographers-strike-back-in-uk.md | yshz/blog | eff12db2e9df0fbc10fad54f92890511fa07b1f0 | [
"MIT"
] | null | null | null | content/posts/2008-04-11-photographers-strike-back-in-uk.md | yshz/blog | eff12db2e9df0fbc10fad54f92890511fa07b1f0 | [
"MIT"
] | null | null | null | ---
type: link
title: Photographers strike back in UK
linkurl: http://www.pressgazette.co.uk/node/40875
author: Matthias Kretschmann
date: 2008-04-11 14:16:51+00:00
tags:
- photography
---
[](../media/londonpolice.jpg)
Remember [the campaign of the London police](/london-police-afraid-of-photographers/) calling all people to regard photographers as potential terrorists?
Now the [PressGazette reports](http://www.pressgazette.co.uk/node/40875) that
> Labour MP [Member of Parliament] Austin Mitchell is planning to take a delegation of photographers to the Home Office to protest about the growing number of cases in which police officers and others try to stop professional and amateur photographers taking pictures in public places.
You can read the whole story here:
[Photographers lobby parliament over police curbs](http://www.pressgazette.co.uk/node/40875)
| 39.25 | 285 | 0.785563 | eng_Latn | 0.945405 |
61f86475c194ff636fefef28416fbab6c5742f80 | 702 | md | Markdown | images/singularity/README.md | obedmr/oneapi-containers | 6cca6475fe164304c2e5f27f8801af961af5e894 | [
"BSD-3-Clause"
] | null | null | null | images/singularity/README.md | obedmr/oneapi-containers | 6cca6475fe164304c2e5f27f8801af961af5e894 | [
"BSD-3-Clause"
] | null | null | null | images/singularity/README.md | obedmr/oneapi-containers | 6cca6475fe164304c2e5f27f8801af961af5e894 | [
"BSD-3-Clause"
] | 1 | 2022-01-25T16:49:45.000Z | 2022-01-25T16:49:45.000Z | # Intel<sup>®</sup> oneAPI Singularity Containers
Singularity containers are supported by Intel oneAPI.
# Note
**Note:** CentOS* 8 - based containers are deprecated and no longer supported. [Details](https://www.centos.org/centos-linux-eol/). <br />
You may still find the CentOS Dockerfile, but it is no longer being updated.
Here is an example to run the Mandelbrot sample in a Singularity container:
```sh
cd basekit-devel-ubuntu18.04
simg="intel-oneapi-basekit-devel-ubuntu18.04.simg"
sudo singularity build "$simg" Singularity
singularity shell "$simg"
oneapi-cli
# (1) Create a project
# (1) cpp
# select the Mandelbrot sample
cd mandelbrot
mkdir build
cd build
cmake ..
make
make run
```
| 23.4 | 138 | 0.754986 | eng_Latn | 0.949014 |
61f8daf9fe36023db2340d213ba0a1294358b6b8 | 1,070 | md | Markdown | README.md | raytools/Ray2Mod_NewProject | 352ba0c2ec6c3f6ba12069b0a3bc1f378a055361 | [
"Info-ZIP"
] | 1 | 2020-06-07T12:14:32.000Z | 2020-06-07T12:14:32.000Z | README.md | raytools/Ray2Mod_NewProject | 352ba0c2ec6c3f6ba12069b0a3bc1f378a055361 | [
"Info-ZIP"
] | null | null | null | README.md | raytools/Ray2Mod_NewProject | 352ba0c2ec6c3f6ba12069b0a3bc1f378a055361 | [
"Info-ZIP"
] | null | null | null | # Ray2Mod_NewProject
Boilerplate project to quickly get started with Ray2Mod
You can download this repository as a template under the releases page, and add it to Visual Studio by copying the ZIP file to the folder:
``C:\Users\[User Name]\Documents\Visual Studio 20xx\Templates\Project Templates``
To begin modding Rayman 2, you have to add Ray2Mod.dll as a reference to this project:
1. Right click References in the Solution Explorer
2. Click Browse and locate Ray2Mod.dll
3. Press OK to add the reference
Now you can run your mod by building this project and dragging the exported DLL onto ModRunner.exe
To automatically start the ModRunner when clicking Start, configure the following:
1. Click Project -> <YourProject> Properties...
2. Open the Debug configuration on the left
3. As start action, select "Start external program" and Browse for ModRunner.exe
4. Under start options, set the command line arguments to <YourProject>.dll (for example Ray2Mod_NewProject.dll)
Now the ModRunner will start and inject your mod whenever you click start.
| 50.952381 | 138 | 0.78785 | eng_Latn | 0.991995 |
61f91b2f7762a83f3f5c08c4e5a52dc7b1371bff | 225 | md | Markdown | _watches/M20200808_054613_TLP_7.md | Meteoros-Floripa/meteoros.floripa.br | 7d296fb8d630a4e5fec9ab1a3fb6050420fc0dad | [
"MIT"
] | 5 | 2020-01-22T17:44:06.000Z | 2020-01-26T17:57:58.000Z | _watches/M20200808_054613_TLP_7.md | Meteoros-Floripa/site | 764cf471d85a6b498873610e4f3b30efd1fd9fae | [
"MIT"
] | null | null | null | _watches/M20200808_054613_TLP_7.md | Meteoros-Floripa/site | 764cf471d85a6b498873610e4f3b30efd1fd9fae | [
"MIT"
] | 2 | 2020-05-19T17:06:27.000Z | 2020-09-04T00:00:43.000Z | ---
layout: watch
title: TLP7 - 08/08/2020 - M20200808_054613_TLP_7T.jpg
date: 2020-08-08 05:46:13
permalink: /2020/08/08/watch/M20200808_054613_TLP_7
capture: TLP7/2020/202008/20200807/M20200808_054613_TLP_7T.jpg
---
| 28.125 | 63 | 0.76 | eng_Latn | 0.03883 |
61f9e850492c25089a38616f82975b572fcea3b5 | 658 | md | Markdown | module/docs/enums/internal_.PaneType.md | Norviah/animal-crossing | 106aea6f6334f616c37464f75362f0ca0ae8818e | [
"MIT"
] | 30 | 2020-06-12T14:35:01.000Z | 2021-10-01T23:30:23.000Z | module/docs/enums/internal_.PaneType.md | Norviah/animal-crossing | 106aea6f6334f616c37464f75362f0ca0ae8818e | [
"MIT"
] | 8 | 2021-03-19T20:06:45.000Z | 2022-01-19T23:35:27.000Z | module/docs/enums/internal_.PaneType.md | Norviah/animal-crossing | 106aea6f6334f616c37464f75362f0ca0ae8818e | [
"MIT"
] | 7 | 2020-07-12T12:08:00.000Z | 2022-01-07T08:38:58.000Z | [animal-crossing](../README.md) / [Exports](../modules.md) / [<internal\>](../modules/internal_.md) / PaneType
# Enumeration: PaneType
[<internal>](../modules/internal_.md).PaneType
## Table of contents
### Enumeration members
- [Glass](internal_.PaneType.md#glass)
- [Screen](internal_.PaneType.md#screen)
## Enumeration members
### Glass
• **Glass** = `"Glass"`
#### Defined in
[types/Item.ts:281](https://github.com/Norviah/animal-crossing/blob/d6e407b/module/types/Item.ts#L281)
___
### Screen
• **Screen** = `"Screen"`
#### Defined in
[types/Item.ts:282](https://github.com/Norviah/animal-crossing/blob/d6e407b/module/types/Item.ts#L282)
| 19.939394 | 110 | 0.68693 | yue_Hant | 0.502189 |
61fa16e4a7c35506229ae09763cd9e8bdc3d646c | 742 | md | Markdown | README.md | throwawaymicrosoft/firebase | eeed0e179728d53c01a0bf3cb2328b46145c9e0d | [
"MIT"
] | null | null | null | README.md | throwawaymicrosoft/firebase | eeed0e179728d53c01a0bf3cb2328b46145c9e0d | [
"MIT"
] | null | null | null | README.md | throwawaymicrosoft/firebase | eeed0e179728d53c01a0bf3cb2328b46145c9e0d | [
"MIT"
] | null | null | null | Сборник сниппетов Firebase
==========================
```{
"rules": {
".read": false,
".write": false,
// Право на запись имеют только те, кто содержится в users
"posts": {
".read": true,
"$post": {
".read": "root.child('posts').child($post).child('visible').val() === true",
".write": "root.child('users').child(auth.uid).child('roles').child('administrator').val() === true"
},
},
"settings": {
".read": true,
"$uid": {
".write": "root.child('users').child(auth.uid).child('roles').child('administrator').val() === true"
},
},
"logs": {
".write": true,
},
"users": {
".read": false,
".write": false,
}
}
}```
| 24.733333 | 109 | 0.463612 | eng_Latn | 0.18634 |
61fa68ccdbe2bc0e721524e9acf66bc57e0373a1 | 1,809 | md | Markdown | docs/mfc/exceptions-examining-exception-contents.md | jmittert/cpp-docs | cea5a8ee2b4764b2bac4afe5d386362ffd64e55a | [
"CC-BY-4.0",
"MIT"
] | 1 | 2019-02-10T10:38:37.000Z | 2019-02-10T10:38:37.000Z | docs/mfc/exceptions-examining-exception-contents.md | jmittert/cpp-docs | cea5a8ee2b4764b2bac4afe5d386362ffd64e55a | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/mfc/exceptions-examining-exception-contents.md | jmittert/cpp-docs | cea5a8ee2b4764b2bac4afe5d386362ffd64e55a | [
"CC-BY-4.0",
"MIT"
] | 1 | 2020-06-14T03:42:31.000Z | 2020-06-14T03:42:31.000Z | ---
title: "Exceptions: Examining Exception Contents"
ms.date: "11/04/2016"
helpviewer_keywords: ["exception handling [MFC], MFC", "try-catch exception handling [MFC], MFC function exceptions", "catch blocks, MFC function exceptions", "CException class [MFC], class exceptions", "try-catch exception handling [MFC], exception contents", "throwing exceptions [MFC], exception contents"]
ms.assetid: dfda4782-b969-4f60-b867-cc204ea7f33a
---
# Exceptions: Examining Exception Contents
Although a **catch** block's argument can be of almost any data type, the MFC functions throw exceptions of types derived from the class `CException`. To catch an exception thrown by an MFC function, then, you write a **catch** block whose argument is a pointer to a `CException` object (or an object derived from `CException`, such as `CMemoryException`). Depending on the exact type of the exception, you can examine the data members of the exception object to gather information about the specific cause of the exception.
For example, the `CFileException` type has the `m_cause` data member, which contains an enumerated type that specifies the cause of the file exception. Some examples of the possible return values are `CFileException::fileNotFound` and `CFileException::readOnly`.
The following example shows how to examine the contents of a `CFileException`. Other exception types can be examined similarly.
[!code-cpp[NVC_MFCExceptions#13](../mfc/codesnippet/cpp/exceptions-examining-exception-contents_1.cpp)]
For more information, see [Exceptions: Freeing Objects in Exceptions](../mfc/exceptions-freeing-objects-in-exceptions.md) and [Exceptions: Catching and Deleting Exceptions](../mfc/exceptions-catching-and-deleting-exceptions.md).
## See Also
[Exception Handling](../mfc/exception-handling-in-mfc.md)
| 78.652174 | 524 | 0.785517 | eng_Latn | 0.94966 |
61fabe1455bd765748dc63d0de58bf9c44e4ae3e | 2,216 | md | Markdown | _posts/2018-11-11-Download-building-arguments.md | Kirsten-Krick/Kirsten-Krick | 58994392de08fb245c4163dd2e5566de8dd45a7a | [
"MIT"
] | null | null | null | _posts/2018-11-11-Download-building-arguments.md | Kirsten-Krick/Kirsten-Krick | 58994392de08fb245c4163dd2e5566de8dd45a7a | [
"MIT"
] | null | null | null | _posts/2018-11-11-Download-building-arguments.md | Kirsten-Krick/Kirsten-Krick | 58994392de08fb245c4163dd2e5566de8dd45a7a | [
"MIT"
] | null | null | null | ---
layout: post
comments: true
categories: Other
---
## Download Building arguments book
I think I was put off it by the I settled back in the chair, I'm with you," he told them. You'll be praised on America's Most Wanted and maybe even hugged on The detective shrugged! He still had to get one more endorsement But now it seemed possible, noisy stream he had heard singing through his sleep all his nights in Woodedge, lightly dozing, I already had the vague outline of a plan. Mohn's _Norges Klima_ (reprinted from "I'm not truly a teller, "God grant thee that thou seekest, because even during an episode of The power building arguments the Archmage of Roke was in many respects that of a king. Not just witchcraft? The heavy gate opened soundlessly. However, and assume sole and total authority for the duration of such emergency situations as he saw fit to declare. 020LeGuin20-20Tales20From20Earthsea. " easily imagine he is looking building arguments ten mystical entry points to building arguments sky of another world. after various Building arguments, but before it had An abandoned bicycle on its side, I think Jay probably wants to talk about things you wouldn't be interested in," Colman said to Anita, how much was that. up there. "That's discrimination. The chill that shivered through 172 the house trailer next door. in the building arguments a shape as lacking in detail as that of the robed and hooded Catcher. he wouldn't. hardened old snow, her face expressing formless doubts. Forgive me? He studied building arguments from a safe trust committed to him! He skids and nearly falls on a cascade of loose for what it was. They are said to of its own, as did Celestina and Grace, and asked, as he had taught it to her? As often as either man entered the other's a combination doctor's-assayer's office, he's barely able to be poor where _do they come from, but She didn't actually expect to meet Preston Maddoc. " She bit her building arguments lip, 1880, eating and drinking, Dr, swing back building arguments traditional, he tried green. (117) I denied the debt, and knew he was fortunate, on the ground of a text in the Gospel of Matthew near Cape Lisburn on the American side. stubborn lid. | 246.222222 | 2,124 | 0.788809 | eng_Latn | 0.999886 |
61fae0f419426f033f9da3c0abb678a71a31fb61 | 54 | md | Markdown | ChangeLog.md | awave1/lambda-lifting | 7d0590b0d42c37c27bbe837e4078103b8dc831ab | [
"BSD-3-Clause"
] | null | null | null | ChangeLog.md | awave1/lambda-lifting | 7d0590b0d42c37c27bbe837e4078103b8dc831ab | [
"BSD-3-Clause"
] | null | null | null | ChangeLog.md | awave1/lambda-lifting | 7d0590b0d42c37c27bbe837e4078103b8dc831ab | [
"BSD-3-Clause"
] | null | null | null | # Changelog for lambda-lifting
## Unreleased changes
| 13.5 | 30 | 0.777778 | eng_Latn | 0.997715 |
61fae94f7dc6cb328bce172d405a433aa68239a1 | 1,295 | md | Markdown | includes/migration-guide/retargeting/asp/httpruntimeappdomainapppath-throws-nullreferenceexception.md | Athosone/docs.fr-fr | 83c2fd74def907edf5da4a31fee2d08133851d2f | [
"CC-BY-4.0",
"MIT"
] | null | null | null | includes/migration-guide/retargeting/asp/httpruntimeappdomainapppath-throws-nullreferenceexception.md | Athosone/docs.fr-fr | 83c2fd74def907edf5da4a31fee2d08133851d2f | [
"CC-BY-4.0",
"MIT"
] | null | null | null | includes/migration-guide/retargeting/asp/httpruntimeappdomainapppath-throws-nullreferenceexception.md | Athosone/docs.fr-fr | 83c2fd74def907edf5da4a31fee2d08133851d2f | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
ms.openlocfilehash: e7154919d6a09a04e650d5546feb2ae6c6cc912f
ms.sourcegitcommit: 5b6d778ebb269ee6684fb57ad69a8c28b06235b9
ms.translationtype: HT
ms.contentlocale: fr-FR
ms.lasthandoff: 04/08/2019
ms.locfileid: "59234937"
---
### <a name="httpruntimeappdomainapppath-throws-a-nullreferenceexception"></a>HttpRuntime.AppDomainAppPath lève une exception NullReferenceException
| | |
|---|---|
|Détails|Dans .NET Framework 4.6.2, le runtime lève un <code>T:System.NullReferenceException</code> quand vous récupérez une valeur <code>P:System.Web.HttpRuntime.AppDomainAppPath</code> qui inclut des caractères null. Dans .NET Framework 4.6.1 et les versions antérieures, le runtime lève un <code>T:System.ArgumentNullException</code>.|
|Suggestion|Vous pouvez effectuer l’une ou l’autre des opérations suivantes pour répondre à ce changement :<ul><li>Gérer <code>T:System.NullReferenceException</code> si votre application s’exécute sur .NET Framework 4.6.2.</li><li>Effectuer une mise à niveau vers .NET Framework 4.7, qui restaure le comportement précédent et lève un <code>T:System.ArgumentNullException</code>.</li></ul>|
|Portée|Microsoft Edge|
|Version|4.6.2|
|Type|Reciblage|
|API affectées|<ul><li><xref:System.Web.HttpRuntime.AppDomainAppPath?displayProperty=nameWithType></li></ul>|
| 68.157895 | 389 | 0.789961 | fra_Latn | 0.507739 |
61faed9e719a86db05cafd9c3bdd1eeefdd6f4f1 | 1,397 | md | Markdown | data/issues/ZF-3962.md | zendframework/zf3-web | 5852ab5bfd47285e6b46f9e7b13250629b3e372e | [
"BSD-3-Clause"
] | 40 | 2016-06-23T17:52:49.000Z | 2021-03-27T20:02:40.000Z | data/issues/ZF-3962.md | zendframework/zf3-web | 5852ab5bfd47285e6b46f9e7b13250629b3e372e | [
"BSD-3-Clause"
] | 80 | 2016-06-24T13:39:11.000Z | 2019-08-08T06:37:19.000Z | data/issues/ZF-3962.md | zendframework/zf3-web | 5852ab5bfd47285e6b46f9e7b13250629b3e372e | [
"BSD-3-Clause"
] | 52 | 2016-06-24T22:21:49.000Z | 2022-02-24T18:14:03.000Z | ---
layout: issue
title: "parseOnLoad: true should not be forced if using UseDeclarative"
id: ZF-3962
---
ZF-3962: parseOnLoad: true should not be forced if using UseDeclarative
-----------------------------------------------------------------------
Issue Type: Bug Created: 2008-08-15T12:32:32.000+0000 Last Updated: 2008-09-02T10:39:40.000+0000 Status: Resolved Fix version(s): - 1.6.0 (02/Sep/08)
Reporter: Bernd Matzner (bmatzner) Assignee: Matthew Weier O'Phinney (matthew) Tags: - Zend\_Dojo
Related issues:
Attachments:
### Description
In Dojo/View/Helper/Dojo/Container.php, the djconfig parameter parseOnLoad is forced to true when using declarative style. There are cases, however, when manually running the parser after custom code may be desired. So we should be able to override the default parseonload: true using setDjConfigOption.
### Comments
Posted by Matthew Weier O'Phinney (matthew) on 2008-08-22T15:01:31.000+0000
The only way to do this is to check if parseOnLoad has been previously set; otherwise, the default should be to turn it on.
Scheduling for RC3.
Posted by Matthew Weier O'Phinney (matthew) on 2008-08-24T13:54:08.000+0000
Fixed in trunk and 1.6 release branch. Set the djConfig option "parseOnLoad" to "false" to disable.
Posted by Wil Sinclair (wil) on 2008-09-02T10:39:40.000+0000
Updating for the 1.6.0 release.
| 27.94 | 303 | 0.710093 | eng_Latn | 0.94127 |
61fb0b1c3bffd9fefe5fec7ae1aa62243d925fbd | 746 | md | Markdown | new-docs/puppeteer.protocol.systeminfo.getprocessinforesponse.md | hugodes/puppeteer | 15d1906e7c1b6ded6cfd802db38a2f906094c6fb | [
"Apache-2.0"
] | 1 | 2020-08-08T18:56:31.000Z | 2020-08-08T18:56:31.000Z | new-docs/puppeteer.protocol.systeminfo.getprocessinforesponse.md | hugodes/puppeteer | 15d1906e7c1b6ded6cfd802db38a2f906094c6fb | [
"Apache-2.0"
] | 1 | 2020-10-06T18:53:37.000Z | 2020-10-06T18:53:37.000Z | new-docs/puppeteer.protocol.systeminfo.getprocessinforesponse.md | hugodes/puppeteer | 15d1906e7c1b6ded6cfd802db38a2f906094c6fb | [
"Apache-2.0"
] | 1 | 2021-12-10T08:15:34.000Z | 2021-12-10T08:15:34.000Z | <!-- Do not edit this file. It is automatically generated by API Documenter. -->
[Home](./index.md) > [puppeteer](./puppeteer.md) > [Protocol](./puppeteer.protocol.md) > [SystemInfo](./puppeteer.protocol.systeminfo.md) > [GetProcessInfoResponse](./puppeteer.protocol.systeminfo.getprocessinforesponse.md)
## Protocol.SystemInfo.GetProcessInfoResponse interface
<b>Signature:</b>
```typescript
export interface GetProcessInfoResponse
```
## Properties
| Property | Type | Description |
| --- | --- | --- |
| [processInfo](./puppeteer.protocol.systeminfo.getprocessinforesponse.processinfo.md) | [ProcessInfo](./puppeteer.protocol.systeminfo.processinfo.md)<!-- -->\[\] | An array of process info blocks. |
| 39.263158 | 236 | 0.705094 | kor_Hang | 0.218396 |
61fda85d573f70e89bf99a149b13faa83f883af2 | 12,643 | md | Markdown | pipelines/skylab/scATAC/README.md | Ldcabansay/warp | e7257da275df5cb961d18d394fd62cf3be50a6ca | [
"BSD-3-Clause"
] | null | null | null | pipelines/skylab/scATAC/README.md | Ldcabansay/warp | e7257da275df5cb961d18d394fd62cf3be50a6ca | [
"BSD-3-Clause"
] | null | null | null | pipelines/skylab/scATAC/README.md | Ldcabansay/warp | e7257da275df5cb961d18d394fd62cf3be50a6ca | [
"BSD-3-Clause"
] | null | null | null | | Pipeline Version | Date Updated | Documentation Author | Questions or Feedback |
| :----: | :---: | :----: | :--------------: |
| [scATAC 1.1.0 ](scATAC.wdl) | August 24th 2020 | [Elizabeth Kiernan](mailto:[email protected]) | Please file GitHub issues in skylab or contact [Kylee Degatano](mailto:[email protected]) |
- [Overview](#overview)
- [Introduction](#introduction)
* [Quick Start Table](#quick-start-table)
- [Set-up](#set-up)
* [Workflow Installation and Requirements](#workflow-installation-and-requirements)
* [Pipeline Inputs](#pipeline-inputs)
* [Input File Preparation](#input-file-preparation)
+ [R1 and R2 FASTQ Preparation](#r1-and-r2-fastq-preparation)
+ [Input_reference Preparation](#input_reference-preparation)
- [Workflow Tasks and Tools](#workflow-tasks-and-tools)
* [Task Summary](#task-summary)
+ [AlignPairedEnd](#alignpairedend)
+ [SnapPre](#snappre)
- [Filtering Parameters](#filtering-parameters)
- [Snap QC Metrics](#snap-qc-metrics)
+ [SnapCellByBin](#snapcellbybin)
+ [MakeCompliantBAM](#makecompliantbam)
+ [BreakoutSnap](#breakoutsnap)
- [Outputs](#outputs)
- [Running on Terra](#running-on-terra)
- [Versioning](#versioning)
- [Pipeline Improvements](#pipeline-improvements)
# Overview
<img src="documentation/scATAC_diagram.png" width="500">
# Introduction
The scATAC Pipeline was developed by the Broad DSP Pipelines team to process single nucleus ATAC-seq datasets. The pipeline is based on the [SnapATAC pipeline](https://github.com/r3fang/SnapATAC) described by [Fang et al. (2019)](https://www.biorxiv.org/content/10.1101/615179v2.full). Overall, the pipeline uses the python module [SnapTools](https://github.com/r3fang/SnapTools) to align and process paired reads in the form of FASTQ files. It produces an hdf5-structured Snap file that includes a cell-by-bin count matrix. In addition to the Snap file, the final outputs include a GA4GH compliant aligned BAM and QC metrics.
| Want to use the scATAC Pipeline for your publication? |
|---|
| *Check out the [scATAC Publication Methods](scatac.methods.md) to get started!* |
## Quick Start Table
| Pipeline Features | Description | Source |
| --- |--- | --- |
| Assay Type | Single nucleus ATAC-seq | [Preprint here ](https://www.biorxiv.org/content/biorxiv/early/2019/05/13/615179.full.pdf)
| Overall Workflow | Generates Snap file with cell x bin matrix | Code available from [GitHub](scATAC.wdl) |
| Workflow Language | WDL 1.0 | [openWDL](https://github.com/openwdl/wdl) |
| Aligner | BWA | [Li H. and Durbin R., 2009](https://pubmed.ncbi.nlm.nih.gov/19451168/) |
| Data Input File Format | File format in which sequencing data is provided | Paired-end FASTQs with cell barcodes appended to read names (read barcode demultiplexing section [here](https://github.com/r3fang/SnapATAC/wiki/FAQs#whatissnap)) |
| Data Output File Format | File formats in which scATAC output is provided | [BAM](http://samtools.github.io/hts-specs/), [Snap](https://github.com/r3fang/SnapATAC/wiki/FAQs#whatissnap) |
# Set-up
## Workflow Installation and Requirements
The [scATAC workflow](scATAC.wdl) is written in the Workflow Description Language WDL and can be downloaded by cloning the GitHub [WARP repository](https://github.com/broadinstitute/warp/). The workflow can be deployed using [Cromwell](https://cromwell.readthedocs.io/en/stable/), a GA4GH compliant, flexible workflow management system that supports multiple computing platforms. For the latest workflow version and release notes, please see the scATAC [changelog](scATAC.changelog.md).
## Pipeline Inputs
The pipeline inputs are detailed in the table below. You can test the workflow by using the [https://github.com/broadinstitute/warp/tree/master/pipelines/skylab/scATAC/example_inputs/human_example.json](human_example.json) example configuration file.
| Input name | Input type | Description |
| --- | --- | --- |
| input_fastq1 | File | FASTQ file of the first reads (R1) |
| input_fastq2 | File | FASTQ file of the second reads (R2) |
| input_reference | File | Reference bundle that is generated with bwa-mk-index-wdl found [here](https://github.com/HumanCellAtlas/skylab/blob/master/library/accessory_workflows/build_bwa_reference/bwa-mk-index.wdl)|
| genome_name | String | Name of the genomic reference (name that precedes the “.tar” in the input_reference) |
| output_bam | String | Name for the output BAM |
| bin_size_list | String | List of bin sizes used to generate cell-by-bin matrices; default is 10000 bp |
## Input File Preparation
### R1 and R2 FASTQ Preparation
The scATAC workflow requires paired reads in the form FASTQ files with the cell barcodes appended to the readnames. A description of the barcode demultiplexing can be found on the SnapATAC documentation (see barcode demultiplexing section [here](https://github.com/r3fang/SnapATAC/wiki/FAQs#CEMBA_snap)). The full cell barcode must form the first part of the read name (for both R1 and R2 files) and be separated from the rest of the line by a colon. You can find an example python code to perform demultiplexing in the [SnapTools documentation here](https://github.com/r3fang/SnapTools/blob/master/snaptools/dex_fastq.py). The codeblock below demonstrates the correct format.
```
@CAGTTGCACGTATAGAACAAGGATAGGATAAC:7001113:915:HJ535BCX2:1:1106:1139:1926 1:N:0:0
ACCCTCCGTGTGCCAGGAGATACCATGAATATGCCATAGAACCTGTCTCT
+
DDDDDIIIIIIIIIIIIIIHHIIIIIIIIIIIIIIIIIIIIIIIIIIIII
```
### Input_reference Preparation
The input_reference is a BWA compatible reference bundle in TAR file format. You can create this BWA reference using the accessory workflow [here](https://github.com/HumanCellAtlas/skylab/blob/master/library/accessory_workflows/build_bwa_reference/bwa-mk-index.wdl).
# Workflow Tasks and Tools
The [scATAC workflow](scATAC.wdl) is divided into multiple tasks which are described in the table below. The table also links to the Docker Image for each task and to the documentation or code for the relevant software tool parameters.
| Task | Task Description | Tool Docker Image | Parameter Descriptions or Code |
|--- | --- | --- | --- |
| AlignPairedEnd | Align the modified FASTQ files to the genome | [snaptools:0.0.1](https://github.com/broadinstitute/warp/blob/master/dockers/skylab/snaptools/Dockerfile) | [SnapTools documentation](https://github.com/r3fang/SnapTools) |
| SnapPre | Initial generation of snap file | [snaptools:0.0.1](https://github.com/broadinstitute/warp/blob/master/dockers/skylab/snaptools/Dockerfile) | [SnapTools documentation](https://github.com/r3fang/SnapTools) |
| SnapCellByBin | Binning of data by genomic bins | [snaptools:0.0.1](https://github.com/broadinstitute/warp/blob/master/dockers/skylab/snaptools/Dockerfile) | [SnapTools documentation](https://github.com/r3fang/SnapTools) |
| MakeCompliantBAM | Generation of a GA4GH compliant BAM | [snaptools:0.0.1](https://github.com/broadinstitute/warp/blob/master/dockers/skylab/snaptools/Dockerfile) | [Code](https://github.com/broadinstitute/warp/blob/master/dockers/skylab/snaptools/makeCompliantBAM.py) |
| BreakoutSnap | Extraction of tables from snap file into text format (for testing and user availability) | [snap-breakout:0.0.1](https://github.com/HumanCellAtlas/skylab/tree/master/docker/snap-breakout) | [Code](https://github.com/broadinstitute/warp/tree/master/dockers/skylab/snap-breakout/breakoutSnap.py) |
## Task Summary
### AlignPairedEnd
The AlignPairedEnd task takes the barcode demultiplexed FASTQ files and aligns reads to the genome using the BWA aligner. It uses the SnapTools min_cov parameter to set the minimum number of barcodes a fragment requires to be included in the final output. This parameter is set to 0. The final task output is an aligned BAM file.
### SnapPre
The SnapPre task uses SnapTools to perform preprocessing and filtering on the aligned BAM. The task outputs are a Snap file and QC metrics. The table below details the filtering parameters for the task.
#### Filtering Parameters
| Parameter | Description | Value |
| --- | --- | --- |
| --min-mapq | Fragments with mappability less than value will be filtered | 30 |
| --min-flen | Fragments of length shorter than min_flen will be filtered | 0 |
| --max-flen | Fragments of length bigger than min_flen will be filtered | 1000 |
| --keep-chrm | Boolean variable indicates whether to keep reads mapped to chrM | TRUE |
| --keep-single | Boolean variable indicates whether to keep single-end reads | TRUE |
| --keep-secondary | Boolean variable indicates whether to keep secondary alignments | FALSE
| --max-num | Max number of barcodes to be stored. Based on the coverage, top max_barcode barcodes are selected and stored | 1000000 |
| --min-cov | Fragments with less than min-cov number of barcodes will be filtered | 100 |
### SnapCellByBin
The SnapCellByBin task uses the Snap file to create cell-by-bin count matrices in which a “1” represents a bin with an accessible region of the genome and a “0” represents an inaccessible region. The bin_size_list sets the bin size to 10,000 bp by default but can be changed by specifying the value in the inputs to the workflow.
### MakeCompliantBAM
The MakeCompliantBAM task uses a [custom python script (here)](https://github.com/broadinstitute/warp/blob/master/dockers/skylab/snaptools/makeCompliantBAM.py) to make a GA4GH compliant BAM by moving the cellular barcodes in the read names to the CB tag.
### BreakoutSnap
The BreakoutSnap task extracts data from the Snap file and exports it to individual CSVs. These CSV outputs are listed in the table in the Outputs section below. The code is available [here](https://github.com/broadinstitute/warp/tree/master/dockers/skylab/snap-breakout/breakoutSnap.py).
# Outputs
The main outputs of the scATAC workflow are the Snap file, Snap QC metrics, and the GA4GH compliant BAM file. All files with the prefix “breakout” are CSV files containing individual pieces of data from the Snap. The sessions for the Snap file are described in the [SnapTools documentation](https://github.com/r3fang/SnapTools). Additionally, you can read detailed information on the [Snap file fields for each session](https://github.com/r3fang/SnapTools/blob/master/docs/snap_format.docx) (select "View Raw").
| Output File Name | Description |
| --- | --- |
| output_snap_qc | Quality control file corresponding to the snap file |
| output_snap | Output snap file (in hdf5 container format) |
| output_aligned_bam | Output BAM file, compliant with GA4GH standards |
| breakout_barcodes | Text file containing the FM ('Fragment session') barcodeLen and barcodePos fields |
| breakout_fragments | Text file containing the FM ('Fragments session') fragChrom, fragLen, and fragStart fields |
| breakout_binCoordinates | Text file with the AM session ('Cell x bin accessibility' matrix) binChrom and binStart fields |
| breakout_binCounts | Text file with the AM session ('Cell x bin accessibility' matrix) idx, idy, and count fields |
| breakout_barcodesSection | Text file with the data from the BD session ('Barcode session' table) |
#### Snap QC Metrics
The following table details the metrics available in the output_snap_qc file.
| QC Metric | Abbreviation |
| --- | --- |
| Total number of unique barcodes | No abbreviation |
| Total number of fragments | TN |
| Total number of uniquely mapped | UM |
| Total number of single ends | SE |
| Total number of secondary alignments | SA |
| Total number of paired ends | PE |
| Total number of proper paired | PP |
| Total number of proper frag len | PL |
| Total number of usable fragments | US |
| Total number of unique fragments | UQ |
| Total number of chrM fragments | CM |
# Running on Terra
[Terra](https://app.terra.bio/) is a public, cloud-based platform for biomedical research. If you would like to try the scATAC workflow (previously named "snap-atac") in Terra, you can import the most recent version from the [Broad Methods Repository](https://portal.firecloud.org/?return=terra#methods/snap-atac-v1_0/snap-atac-v1_0/2) (Google login required). Additionally, there is a public [SnapATAC_Pipeline workspace](https://app.terra.bio/#workspaces/brain-initiative-bcdc/SnapATAC_Pipeline) preloaded with the scATAC workflow and downsampled data.
# Versioning
All scATAC workflow releases are documented in the [scATAC changelog](scATAC.changelog.md).
# Pipeline Improvements
Please help us make our tools better by contacting [Kylee Degatano](mailto:[email protected]) for pipeline-related suggestions or questions.
| 74.810651 | 677 | 0.761528 | eng_Latn | 0.882533 |
61fdddedb372b8a09e0b3d802d515c0323141952 | 6,190 | md | Markdown | translations/es-ES/content/packages/quickstart.md | feii33/docs | 5d2e0d713b818d1a8e79c89747cab627298a084c | [
"CC-BY-4.0",
"MIT"
] | 4 | 2021-03-18T10:00:20.000Z | 2022-03-27T11:12:05.000Z | translations/es-ES/content/packages/quickstart.md | sanchezkearrah/docs | a5bdf26b5cd81faff04fe39ed4cbfc1ca7731264 | [
"CC-BY-4.0",
"MIT"
] | 12 | 2021-02-09T07:04:34.000Z | 2021-07-23T06:18:27.000Z | translations/es-ES/content/packages/quickstart.md | sanchezkearrah/docs | a5bdf26b5cd81faff04fe39ed4cbfc1ca7731264 | [
"CC-BY-4.0",
"MIT"
] | 1 | 2021-01-16T23:09:44.000Z | 2021-01-16T23:09:44.000Z | ---
title: Guía de inciio rápido para GitHub Packages
intro: 'Publica en el {% data variables.product.prodname_registry %} en 5 minutos o menos con {% data variables.product.prodname_actions %}.'
allowTitleToDifferFromFilename: verdadero
versions:
free-pro-team: '*'
enterprise-server: '>=2.22'
---
### Introducción
Solo necesitas un repositorio existente de {% data variables.product.prodname_dotcom %} para publicar un paquete en el {% data variables.product.prodname_registry %}. En esta guía, crearás un flujo de trabajo de {% data variables.product.prodname_actions %} para probar tu código y luego lo publicarás en el {% data variables.product.prodname_registry %}. Ten la libertad de crear un repositorio nuevo para esta guía de incio rápido. Puedes utilizarlo para probar este flujo de trabajo de {% data variables.product.prodname_actions %}, y los subsecuentes.
### Publicar tu paquete
1. Crea un repositorio nuevo en {% data variables.product.prodname_dotcom %}, agregando el `.gitignore` para Node. Crea un repositorio si te gustaría borrar este paquete posteriormente. Los paquetes públicos no podrán borrarse. Para obtener más información, consulta la sección "[Crear un nuevo repositorio](/github/creating-cloning-and-archiving-repositories/creating-a-new-repository)."
2. Clona el repositorio en tu máquina local.
{% raw %}
```shell
$ git clone https://github.com/<em>YOUR-USERNAME</em>/<em>YOUR-REPOSITORY</em>.git
$ cd <em>YOUR-REPOSITORY</em>
```
{% endraw %}
3. Crea un archivo `index.js` y agrega una alerta básica que diga "Hello world!"
{% raw %}
```javascript{:copy}
alert("Hello, World!");
```
{% endraw %}
4. Inicializa un paquete de npm. En el asistente de inicialización de paquetes, ingresa tu paquete con el nombre _`@YOUR-USERNAME/YOUR-REPOSITORY`_, y configura el script de prueba como `exit 0` si no tienes ninguna prueba. Confirma tus cambios y súbelos a
{% data variables.product.prodname_dotcom %}.
{% raw %}
```shell
$ npm init
...
package name: <em>@YOUR-USERNAME/YOUR-REPOSITORY</em>
...
test command: <em>exit 0</em>
...
$ npm install
$ git add index.js package.json package-lock.json
$ git commit -m "initialize npm package"
$ git push
```
{% endraw %}
5. Desde tu repositorio en {% data variables.product.prodname_dotcom %}, crea un archivo nuevo en el directorio `.github/workflows` que se llame `release-package.yml`. Para obtener más información, consulta "[Crear nuevos archivos](/github/managing-files-in-a-repository/creating-new-files)."
6. Copia el siguiente contenido de YAML en el archivo `release-package.yml`.
{% raw %}
```yaml{:copy}
name: Node.js Package
on:
release:
types: [created]
jobs:
build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: actions/setup-node@v1
with:
node-version: 12
- run: npm ci
- run: npm test
publish-gpr:
needs: build
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: actions/setup-node@v1
with:
node-version: 12
registry-url: https://npm.pkg.github.com/
- run: npm ci
- run: npm publish
env:
NODE_AUTH_TOKEN: ${{secrets.GITHUB_TOKEN}}
```
{% endraw %}
7. Desplázate hasta el final de la página y selecciona **Crear una rama nueva para esta confirmación e iniciar una solicitud de cambios**. Después, para crear una solicitud de cambios, da clic en **Proponer un archivo nuevo**.
8. **Fusiona** la solicitud de cambios.
9. Navega a la pestaña de **Código** y crea un lanzamiento nuevo para probar el flujo de trabajo. Para obtener más información, consulta la sección "[Gestionar los lanzamientos en un repositorio](/github/administering-a-repository/managing-releases-in-a-repository#creating-a-release)".
El crear un lanzamiento nuevo en tu repositorio activa el flujo de trabajo para compilar y probar tu código. Si las pruebas pasan, entonces el paquete se publicará en {% data variables.product.prodname_registry %}.
### Visualizar tu paquete publicado
Los paquetes se publican a nivel del repositorio. Puedes ver todos los paquetes en un repositorio y buscar un paquete específico.
{% data reusables.repositories.navigate-to-repo %}
{% data reusables.package_registry.packages-from-code-tab %}
{% data reusables.package_registry.navigate-to-packages %}
### Instalar un paquete publicado
Ahora que publicaste el paquete, querrás utilizarlo como una dependencia en tus proyectos. Para obtener más información, consulta la sección "[Configurar npm para utilizarlo con el {% data variables.product.prodname_registry %}](/packages/guides/configuring-npm-for-use-with-github-packages#installing-a-package)".
### Pasos siguientes
El flujo básico que acabas de agregar se ejecuta en cualquier momento que se cree un lanzamiento nuevo en tu repositorio. Pero, este es solo el principio de lo que puedes hacer con {% data variables.product.prodname_registry %}. Puedes publicar tu paquete en varios registros con un solo flujo de trabajo, activar el flujo de trabajo para que se ejecute en eventos diferentes tales como una solicitud de cambios fusionada, administrar contenedores, y más.
El combinar el {% data variables.product.prodname_registry %} y las {% data variables.product.prodname_actions %} puede ayudarte a automatizar casi cualquier aspecto de tu proceso de desarrollo de aplicaciones. ¿Listo para comenzar? Aquí hay algunos recursos útiles para llevar a cabo los siguientes pasos con el {% data variables.product.prodname_registry %} y las {% data variables.product.prodname_actions %}:
- "[Aprende sobre el {% data variables.product.prodname_registry %}](/packages/learn-github-packages)" para un tutorial más a fondo de GitHub Packages
- "[Aprende sobre las {% data variables.product.prodname_actions %}](/actions/learn-github-actions)" para un tutorial más a fondo de GitHub Actions
- "[Guías](/packages/guides)" para los casos de uso y ejemplos específicos
| 55.765766 | 555 | 0.719871 | spa_Latn | 0.950368 |
61fde984a405a14e81a60c4ee713ce1cbc3998c8 | 1,669 | md | Markdown | README.md | ggdream/suit | 570141a84b5c158ac789f2378b5c3eacad0fa077 | [
"MIT"
] | 1 | 2021-08-02T04:29:18.000Z | 2021-08-02T04:29:18.000Z | README.md | ggdream/suit | 570141a84b5c158ac789f2378b5c3eacad0fa077 | [
"MIT"
] | null | null | null | README.md | ggdream/suit | 570141a84b5c158ac789f2378b5c3eacad0fa077 | [
"MIT"
] | null | null | null | # suit
<div align="center">
<a href="https://space.bilibili.com/264346349">
<img src="https://img.shields.io/badge/bilibili-魔咔啦咔-blueviolet" alt="😭" />
</a>
<a href="https://github.com/ggdream/suit">
<img src="https://img.shields.io/badge/GitHub-魔咔啦咔-ff69b4" alt="😭" />
</a>
</div>
<div align="center">
<img src="https://img.shields.io/badge/Platforms-Windows,Linux,MacOS,Android,IOS,Web-009688" alt="😭" />
<img src="https://img.shields.io/badge/Mode-all-3949ab" alt="😭" />
</div>
## A library of platform adaptation scheme based on the ViewPort (vw and vh) and Relative Position.
---
> The package contains two types of adapters. One is a global adapter based on viewport(vw, vh), and the other is a percentage adapter based on Relative Positioning(rw, rh).
---
## 1. The global adapter based on viewport(vw, vh)
### 1) Initialize the adapter
~~~dart
/// [Adapter] is the global adapter, for Viewport(vw, vh).
Adapter.initialize(context);
~~~
- Cannot initialize under root widget
- An application only needs to initialize the global adapter once
- You can write it in any lifecycle hook, but for desktop application adaptation, it's better to write it in the `build` method
### 2) Import the package and write size by `vw, vh` anywhere
~~~dart
Container(
width: 25.vw,
height: 75.vh
)
~~~
- Method 1: 36.vw, 52.0.vw, 100.vh
- Method 2: Adapter.setVW(25.0), Adapter.setVH(30.0)
- Method 3: Adapter.vw * 32, Adapter.vh * 75
---
## 2. The percentage adapter based on Relative Position(rw, rh).
## 3. Complete example
[Click me to lookup the whole example code](https://pub.dev/packages/suit/example)
| 29.803571 | 173 | 0.681246 | eng_Latn | 0.764355 |
61fdf5088f4772bea8a91d65f6c492b8847b699a | 2,805 | md | Markdown | 3.0/cookbook/graph-java-driver-graph-example-vertex.md | divyanshidm/docs | 846c7bb343f016dbbfb86f62e53f1a0be330767c | [
"Apache-2.0"
] | null | null | null | 3.0/cookbook/graph-java-driver-graph-example-vertex.md | divyanshidm/docs | 846c7bb343f016dbbfb86f62e53f1a0be330767c | [
"Apache-2.0"
] | null | null | null | 3.0/cookbook/graph-java-driver-graph-example-vertex.md | divyanshidm/docs | 846c7bb343f016dbbfb86f62e53f1a0be330767c | [
"Apache-2.0"
] | null | null | null | ---
layout: default
description: Note
---
# How to use an example vertex with the java driver
## Problem
**Note**: Arango 2.2 and the corresponding javaDriver is needed.
You want to retrieve information out of a graph using an object<T> as example vertex, and the object contains primitive datatypes such as 'int' or 'char'. E. g. you have a graph "myGraph" with vertices that are objects of the following class...
```java
public class MyObject {
private String name;
private int age;
public MyObject(String name, int age) {
this.name = name;
this.age = age;
}
/*
* + getter and setter
*/
}
```
... and edges of:
```java
public class MyEdge {
private String desc;
public MyEdge(String desc) {
this.desc = desc;
}
/*
* + getter and setter
*/
}
```
To retrieve all edges from vertices with a given name (e. g. "Bob") and arbitrary age the method
```java
arangoDriver.graphGetEdgeCursorByExample("myGraph", MyEdge.class, myVertexExample)
```
can not be used, because primitive datatypes (like 'int') can not be set to null (all attributes that are not null will be used as filter criteria).
## Solution
There is a solution, but it's not that satisfying, because you need to know the attribute names of the stored documents representing the objects. If you know the attribute names, which are used to filter vertices it's possible to create a map as vertex example:
```java
Map<String, Object> exampleVertex = new HashMap<String, Object>();
exampleVertex.put("name", "Bob");
CursorEntity<MyEdge> cursor = arangoDriver.graphGetEdgeCursorByExample("myGraph", MyEdge.class, exampleVertex);
```
Vice versa it's no problem to retrieve all edges of vertices that have been set to a certain age (e. g. 42):
```java
MyObject myVertexExample = new MyObject(null, 42);
CursorEntity<MyEdge> cursor = arangoDriver.graphGetEdgeCursorByExample("myGraph", MyEdge.class, myVertexExample)
```
## Other resources
More documentation about the ArangoDB java driver is available:
- [Arango DB Java in ten minutes](https://www.arangodb.com/tutorials/tutorial-java/){:target="_blank"}
- [java driver Readme at Github](https://github.com/arangodb/arangodb-java-driver){:target="_blank"}, [especially the graph example](https://github.com/arangodb/arangodb-java-driver/blob/master/src/test/java/com/arangodb/example/GraphQueryExample.java){:target="_blank"}
- [Example source in java](https://github.com/arangodb/arangodb-java-driver/tree/master/src/test/java/com/arangodb/example){:target="_blank"}
- [Unittests](https://github.com/arangodb/arangodb-java-driver/tree/master/src/test/java/com/arangodb){:target="_blank"}
**Author**: [gschwab](https://github.com/gschwab){:target="_blank"}
**Tags**: #java #graph #driver
| 33 | 271 | 0.722638 | eng_Latn | 0.913607 |
61fe648fc969697eee14e0f09ae36c0dcbe62791 | 47 | md | Markdown | README.md | ohinitoffa/ift6266-project | 4cf658530c0300ce68706a4647a4666616f1d039 | [
"MIT"
] | null | null | null | README.md | ohinitoffa/ift6266-project | 4cf658530c0300ce68706a4647a4666616f1d039 | [
"MIT"
] | null | null | null | README.md | ohinitoffa/ift6266-project | 4cf658530c0300ce68706a4647a4666616f1d039 | [
"MIT"
] | null | null | null | # ift6266-project
Conditional Image Generation
| 15.666667 | 28 | 0.851064 | eng_Latn | 0.67891 |
61ff5b99cb5cb21697136b91e7475b9fd1fe2a58 | 2,655 | md | Markdown | _docs/v0.36.6/api/crash-reporter.md | viswanathamsantosh/electron.atom.io | 7a42b1e62abf1b67bf4cf05dc5250c6758062c8a | [
"MIT"
] | null | null | null | _docs/v0.36.6/api/crash-reporter.md | viswanathamsantosh/electron.atom.io | 7a42b1e62abf1b67bf4cf05dc5250c6758062c8a | [
"MIT"
] | 3 | 2021-05-20T21:57:56.000Z | 2022-02-26T10:02:59.000Z | _docs/v0.36.6/api/crash-reporter.md | viswanathamsantosh/electron.atom.io | 7a42b1e62abf1b67bf4cf05dc5250c6758062c8a | [
"MIT"
] | null | null | null | ---
version: v0.36.6
category: API
title: 'Crash Reporter'
source_url: 'https://github.com/atom/electron/blob/master/docs/api/crash-reporter.md'
---
# crashReporter
The `crash-reporter` module enables sending your app's crash reports.
The following is an example of automatically submitting a crash report to a
remote server:
```javascript
const crashReporter = require('electron').crashReporter;
crashReporter.start({
productName: 'YourName',
companyName: 'YourCompany',
submitURL: 'https://your-domain.com/url-to-submit',
autoSubmit: true
});
```
## Methods
The `crash-reporter` module has the following methods:
### `crashReporter.start(options)`
`options` Object, properties:
* `productName` String, default: Electron.
* `companyName` String (**required**)
* `submitURL` String, (**required**)
* URL that crash reports will be sent to as POST.
* `autoSubmit` Boolean, default: `true`.
* Send the crash report without user interaction.
* `ignoreSystemCrashHandler` Boolean, default: `false`.
* `extra` Object
* An object you can define that will be sent along with the report.
* Only string properties are sent correctly.
* Nested objects are not supported.
You are required to call this method before using other `crashReporter`
APIs.
**Note:** On OS X, Electron uses a new `crashpad` client, which is different
from `breakpad` on Windows and Linux. To enable the crash collection feature,
you are required to call the `crashReporter.start` API to initialize `crashpad`
in the main process and in each renderer process from which you wish to collect
crash reports.
### `crashReporter.getLastCrashReport()`
Returns the date and ID of the last crash report. If no crash reports have been
sent or the crash reporter has not been started, `null` is returned.
### `crashReporter.getUploadedReports()`
Returns all uploaded crash reports. Each report contains the date and uploaded
ID.
## crash-reporter Payload
The crash reporter will send the following data to the `submitURL` as `POST`:
* `ver` String - The version of Electron.
* `platform` String - e.g. 'win32'.
* `process_type` String - e.g. 'renderer'.
* `guid` String - e.g. '5e1286fc-da97-479e-918b-6bfb0c3d1c72'
* `_version` String - The version in `package.json`.
* `_productName` String - The product name in the `crashReporter` `options`
object.
* `prod` String - Name of the underlying product. In this case Electron.
* `_companyName` String - The company name in the `crashReporter` `options`
object.
* `upload_file_minidump` File - The crash report as file.
* All level one properties of the `extra` object in the `crashReporter`.
`options` object
| 32.378049 | 85 | 0.74275 | eng_Latn | 0.976717 |
61ffe9a35a81031ceefa2808b42525dcd9b5e09d | 104 | md | Markdown | posts/This-Is-A-Test.md | MineRobber9000/asblog | ac3fb92aa443a93ab37a438677beae1e98eab858 | [
"MIT"
] | 2 | 2019-04-25T18:34:59.000Z | 2019-05-08T05:18:40.000Z | posts/This-Is-A-Test.md | MineRobber9000/asblog | ac3fb92aa443a93ab37a438677beae1e98eab858 | [
"MIT"
] | null | null | null | posts/This-Is-A-Test.md | MineRobber9000/asblog | ac3fb92aa443a93ab37a438677beae1e98eab858 | [
"MIT"
] | null | null | null | ---
title: This is a Test
pubdate: 2019-04-07
published: true
---
This is a test. Nothing to see here.
| 13 | 36 | 0.682692 | eng_Latn | 0.999611 |
61ffee5ad897a95e6c747947e0cd13014d6eb5b2 | 757 | md | Markdown | src/newsletters/2013/35/service-based-membership-providers-for-asp-net.md | dotnetweekly/dnw-gatsby | b12818bfe6fd92aeb576b3f79c75c2e9ac251cdb | [
"MIT"
] | 1 | 2022-02-03T19:26:09.000Z | 2022-02-03T19:26:09.000Z | src/newsletters/2013/35/service-based-membership-providers-for-asp-net.md | dotnetweekly/dnw-gatsby | b12818bfe6fd92aeb576b3f79c75c2e9ac251cdb | [
"MIT"
] | 13 | 2020-08-19T06:27:35.000Z | 2022-02-26T17:45:11.000Z | src/newsletters/2013/35/service-based-membership-providers-for-asp-net.md | dotnetweekly/dnw-gatsby | b12818bfe6fd92aeb576b3f79c75c2e9ac251cdb | [
"MIT"
] | null | null | null | ---
_id: 5a88e1aebd6dca0d5f0d2c42
title: "Service Based Membership Providers for ASP.NET"
url: 'http://www.codeproject.com/Articles/633962/Service-Based-Membership-Providers-for-ASP-NET'
category: articles
slug: 'service-based-membership-providers-for-asp-net'
user_id: 5a83ce59d6eb0005c4ecda2c
username: 'bill-s'
createdOn: '2013-08-24T07:39:56.000Z'
tags: []
---
The ASP.NET framework has a built-in, simple to use declarative mechanism for membership managements, provided that the applications have implemented and configured a set of provider classes derived from a set of predefined base classes. They are called custom providers in the sequel. The present article describes a set of custom providers for a service based membership management system
| 54.071429 | 390 | 0.808454 | eng_Latn | 0.982106 |
1101ba47c5e00b63be9802591b5fa6a54c37ad9d | 1,505 | md | Markdown | README.md | hulufei/RustBook | 257d2ab885b1eccb548bf1680f8ee7354cf43fe6 | [
"Apache-2.0"
] | 1 | 2022-02-20T10:03:45.000Z | 2022-02-20T10:03:45.000Z | README.md | chaos2022/RustBook | 0ffebe247ea686aed9015b29ffb8b4b2282f721e | [
"Apache-2.0"
] | null | null | null | README.md | chaos2022/RustBook | 0ffebe247ea686aed9015b29ffb8b4b2282f721e | [
"Apache-2.0"
] | null | null | null | ### Description [[简](./README_CN.md)、[繁](./README_TW.md)、[日](./README_JP.md)]
A book about [Rust programming language](https://www.rust-lang.org/) written in Chinese. This book contains 9 chapters in which are some data structures and algorithms with practical demos.
* Chapter 1: Computer Science
- Computer science
- Rust review and learning resources
* Chapter 2: Algorithm Analysis
- Big-O notation
* Chapter 3: Basic Data Structures
- Stack, Queue, Deque, List, Vec
* Chapter 4: Recursion
- Recursion theory, Tail-recursion ,Dynamic programming
* Chapter 5: Search
- Sequencial search, Binary search, Hashing search
* Chapter 6: Sort
- Ten basic sort algorithms
* Chapter 7: Tree
- Binary tree, Binary heap, Binary search tree, AVL tree
* Chapter 8: Graph
- Graph representation, BFS, DFS, Shortest path
* Chapter 9: Practice
- Edit Distance, Trie, Filter, LRU
- Consistent hashing, Base58, Blockchain
### Code
All demo codes are saved by chapter under `code/`.

### Stargazer

### Changelog
* 2022-02-15 add stargazer chart
* 2022-02-12 add code statistics
* 2022-02-09 fix typo and `substract with overflow` panic
* 2022-02-06 change code font to monospaced font: [Source Code Pro](https://github.com/adobe-fonts/source-code-pro)
* 2022-02-02 update to rust version 1.58
* 2022-01-31 upload code and the implementation
* 2021-04-24 upload first draft
| 34.204545 | 189 | 0.722924 | eng_Latn | 0.593518 |
11023c222f08981484ada6b4b0fe1712772b9c4f | 82 | md | Markdown | _includes/02-image.md | SaintMichaelArchangel/markdown-portfolio | f7b4a6cbc54120b7a9549bb579fdf6011ed40661 | [
"MIT"
] | null | null | null | _includes/02-image.md | SaintMichaelArchangel/markdown-portfolio | f7b4a6cbc54120b7a9549bb579fdf6011ed40661 | [
"MIT"
] | 5 | 2021-02-07T17:43:36.000Z | 2021-02-07T20:04:39.000Z | _includes/02-image.md | SaintMichaelArchangel/markdown-portfolio | f7b4a6cbc54120b7a9549bb579fdf6011ed40661 | [
"MIT"
] | null | null | null | 
| 41 | 81 | 0.756098 | yue_Hant | 0.589117 |
1103b940a60a932c76867ac5116874d5fb538861 | 3,078 | md | Markdown | docs/framework/data/adonet/how-to-bind-a-dataview-object-to-a-winforms-datagridview-control.md | MarktW86/dotnet.docs | 178451aeae4e2c324aadd427ed6bf6850e483900 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/framework/data/adonet/how-to-bind-a-dataview-object-to-a-winforms-datagridview-control.md | MarktW86/dotnet.docs | 178451aeae4e2c324aadd427ed6bf6850e483900 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/framework/data/adonet/how-to-bind-a-dataview-object-to-a-winforms-datagridview-control.md | MarktW86/dotnet.docs | 178451aeae4e2c324aadd427ed6bf6850e483900 | [
"CC-BY-4.0",
"MIT"
] | 1 | 2021-01-06T09:36:01.000Z | 2021-01-06T09:36:01.000Z | ---
title: "How to: Bind a DataView Object to a Windows Forms DataGridView Control | Microsoft Docs"
ms.custom: ""
ms.date: "03/30/2017"
ms.prod: ".net-framework"
ms.reviewer: ""
ms.suite: ""
ms.technology:
- "dotnet-ado"
ms.tgt_pltfrm: ""
ms.topic: "article"
ms.assetid: 2b73d60a-6049-446a-85a7-3e5a68b183e2
caps.latest.revision: 2
author: "JennieHubbard"
ms.author: "jhubbard"
manager: "jhubbard"
---
# How to: Bind a DataView Object to a Windows Forms DataGridView Control
The <xref:System.Windows.Forms.DataGridView> control provides a powerful and flexible way to display data in a tabular format. The <xref:System.Windows.Forms.DataGridView> control supports the standard Windows Forms data binding model, so it will bind to <xref:System.Data.DataView> and a variety of other data sources. In most situations, however, you will bind to a <xref:System.Windows.Forms.BindingSource> component that will manage the details of interacting with the data source.
For more information about the <xref:System.Windows.Forms.DataGridView> control, see [DataGridView Control Overview](../../../../docs/framework/winforms/controls/datagridview-control-overview-windows-forms.md).
### To connect a DataGridView control to a DataView
1. Implement a method to handle the details of retrieving data from a database. The following code example implements a `GetData` method that initializes a <xref:System.Data.SqlClient.SqlDataAdapter> component and uses it to fill a <xref:System.Data.DataSet>. Be sure to set the `connectionString` variable to a value that is appropriate for your database. You will need access to a server with the AdventureWorks SQL Server sample database installed.
[!code-csharp[DP DataViewWinForms Sample#LDVSample1GetData](../../../../samples/snippets/csharp/VS_Snippets_ADO.NET/DP DataViewWinForms Sample/CS/Form1.cs#ldvsample1getdata)]
[!code-vb[DP DataViewWinForms Sample#LDVSample1GetData](../../../../samples/snippets/visualbasic/VS_Snippets_ADO.NET/DP DataViewWinForms Sample/VB/Form1.vb#ldvsample1getdata)]
2. In the <xref:System.Windows.Forms.Form.Load> event handler of your form, bind the <xref:System.Windows.Forms.DataGridView> control to the <xref:System.Windows.Forms.BindingSource> component and call the `GetData` method to retrieve the data from the database. The <xref:System.Data.DataView> is created from a [!INCLUDE[linq_dataset](../../../../includes/linq-dataset-md.md)] query over the Contact <xref:System.Data.DataTable> and is then bound to the <xref:System.Windows.Forms.BindingSource> component.
[!code-csharp[DP DataViewWinForms Sample#LDVSample1FormLoad](../../../../samples/snippets/csharp/VS_Snippets_ADO.NET/DP DataViewWinForms Sample/CS/Form1.cs#ldvsample1formload)]
[!code-vb[DP DataViewWinForms Sample#LDVSample1FormLoad](../../../../samples/snippets/visualbasic/VS_Snippets_ADO.NET/DP DataViewWinForms Sample/VB/Form1.vb#ldvsample1formload)]
## See Also
[Data Binding and LINQ to DataSet](../../../../docs/framework/data/adonet/data-binding-and-linq-to-dataset.md) | 85.5 | 511 | 0.768356 | yue_Hant | 0.375789 |
1104904f83109147d885215ec2390730dc196543 | 582 | md | Markdown | data/reusables/github-actions/github-token-permissions.md | Plotta/docs | dfc135a2bf91be244db6de86003348719a7e8cad | [
"CC-BY-4.0",
"MIT"
] | 11,698 | 2020-10-07T16:22:18.000Z | 2022-03-31T18:54:47.000Z | data/reusables/github-actions/github-token-permissions.md | Husky57/docs | 1d590a4feb780b0acc9a41381e721b61146175db | [
"CC-BY-4.0",
"MIT"
] | 8,317 | 2020-10-07T16:26:58.000Z | 2022-03-31T23:24:25.000Z | data/reusables/github-actions/github-token-permissions.md | Waleedalaedy/docs | 26d4b73dcbb9a000c32faa37234288649f8d211a | [
"CC-BY-4.0",
"MIT"
] | 48,204 | 2020-10-07T16:15:45.000Z | 2022-03-31T23:50:42.000Z | The `GITHUB_TOKEN` secret is set to an access token for the repository each time a job in a workflow begins. {% ifversion fpt or ghes > 3.1 or ghae or ghec %}You should set the permissions for this access token in the workflow file to grant read access for the `contents` scope and write access for the `packages` scope. {% else %}It has read and write permissions for packages in the repository where the workflow runs. {% endif %}For more information, see "[Authenticating with the GITHUB_TOKEN](/actions/configuring-and-managing-workflows/authenticating-with-the-github_token)."
| 291 | 581 | 0.783505 | eng_Latn | 0.998276 |
110515cb15496e8ad2322266238e99e66b61d8b0 | 2,964 | md | Markdown | docs/promtail/examples.md | JasonLiuMeng/loki | 717e49e3f859c1e3e2a0ef84aed5dc24e1d6253d | [
"Apache-2.0"
] | null | null | null | docs/promtail/examples.md | JasonLiuMeng/loki | 717e49e3f859c1e3e2a0ef84aed5dc24e1d6253d | [
"Apache-2.0"
] | null | null | null | docs/promtail/examples.md | JasonLiuMeng/loki | 717e49e3f859c1e3e2a0ef84aed5dc24e1d6253d | [
"Apache-2.0"
] | null | null | null | # Examples
This document shows some example use-cases for promtail and their configuration.
## Local Config
Using this configuration, all files in `/var/log` and `/srv/log/someone_service` are ingested into Loki.
The labels `job` and `host` are set using `static_configs`.
When using this configuration with Docker, do not forget to mount the configuration, `/var/log` and `/src/log/someone_service` using [volumes](https://docs.docker.com/storage/volumes/).
```yaml
server:
http_listen_port: 9080
grpc_listen_port: 0
positions:
filename: /tmp/positions.yaml # progress of the individual files
client:
url: http://ip_or_hostname_where_loki_runs:3100/api/prom/push
scrape_configs:
- job_name: system
pipeline_stages:
- docker: # Docker wraps logs in json. Undo this.
static_configs: # running locally here, no need for service discovery
- targets:
- localhost
labels:
job: varlogs
host: yourhost
__path__: /var/log/*.log # tail all files under /var/log
- job_name: someone_service
pipeline_stages:
- docker: # Docker wraps logs in json. Undo this.
static_configs: # running locally here, no need for service discovery
- targets:
- localhost
labels:
job: someone_service
host: yourhost
__path__: /srv/log/someone_service/*.log # tail all files under /srv/log/someone_service
```
## Systemd Journal
This example shows how to ship the `systemd` journal to Loki.
Just like the Docker example, the `scrape_configs` section holds various
jobs for parsing logs. A job with a `journal` key configures it for systemd
journal reading.
`path` is an optional string specifying the path to read journal entries
from. If unspecified, defaults to the system default (`/var/log/journal`).
`labels`: is a map of string values specifying labels that should always
be associated with each log entry being read from the systemd journal.
In our example, each log will have a label of `job=systemd-journal`.
Every field written to the systemd journal is available for processing
in the `relabel_configs` section. Label names are converted to lowercase
and prefixed with `__journal_`. After `relabel_configs` processes all
labels for a job entry, any label starting with `__` is deleted.
Our example renames the `_SYSTEMD_UNIT` label (available as
`__journal__systemd_unit` in promtail) to `unit** so it will be available
in Loki. All other labels from the journal entry are dropped.
When running using Docker, **remember to bind the journal into the container**.
```yaml
server:
http_listen_port: 9080
grpc_listen_port: 0
positions:
filename: /tmp/positions.yaml
clients:
- url: http://ip_or_hostname_where_loki_runns:3100/api/prom/push
scrape_configs:
- job_name: journal
journal:
path: /var/log/journal
labels:
job: systemd-journal
relabel_configs:
- source_labels: ['__journal__systemd_unit']
target_label: 'unit'
```
| 31.870968 | 185 | 0.741565 | eng_Latn | 0.969948 |
1105182b6d0d46d7ebd1a0af56c8d83f81f036e9 | 228 | md | Markdown | README.md | VikasKumarSaw/Space-Shooter-Arcade | e4414da36bd7ab6e56d05019b3023bb672146d35 | [
"MIT"
] | 4 | 2021-12-12T15:34:42.000Z | 2022-02-22T18:12:02.000Z | README.md | VikasKumarSaw/Space-Shooter-Arcade | e4414da36bd7ab6e56d05019b3023bb672146d35 | [
"MIT"
] | null | null | null | README.md | VikasKumarSaw/Space-Shooter-Arcade | e4414da36bd7ab6e56d05019b3023bb672146d35 | [
"MIT"
] | null | null | null | # Space-Shooter-Arcade
Space shooter game made in Javascript using Phaser.Js
hosted link : https://ra-raptor.github.io/Space-Shooter-Arcade/
## instructions :
Use Left and Right Arrow keys to move
Use Spacebar to shoot
| 25.333333 | 64 | 0.754386 | eng_Latn | 0.841956 |
1106e9104c078130cc6cac89d5213a6fe77a661e | 1,741 | md | Markdown | docs/visual-basic/language-reference/error-messages/end-of-statement-expected.md | zabereznikova/docs.de-de | 5f18370cd709e5f6208aaf5cf371f161df422563 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/visual-basic/language-reference/error-messages/end-of-statement-expected.md | zabereznikova/docs.de-de | 5f18370cd709e5f6208aaf5cf371f161df422563 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/visual-basic/language-reference/error-messages/end-of-statement-expected.md | zabereznikova/docs.de-de | 5f18370cd709e5f6208aaf5cf371f161df422563 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: end of-Anweisung erwartet.
ms.date: 07/20/2015
f1_keywords:
- bc30205
- vbc30205
helpviewer_keywords:
- BC30205
ms.assetid: 53c7f825-a737-4b76-a1fa-f67745b8bd40
ms.openlocfilehash: 36883989fe5aa0b5c70aa9ab1f7c991fab99e778
ms.sourcegitcommit: ff5a4eb5cffbcac9521bc44a907a118cd7e8638d
ms.translationtype: MT
ms.contentlocale: de-DE
ms.lasthandoff: 10/17/2020
ms.locfileid: "92163128"
---
# <a name="bc30205-end-of-statement-expected"></a>BC30205: Ende der Anweisung erwartet.
Die-Anweisung ist syntaktisch abgeschlossen, aber ein zusätzliches Programmier Element folgt dem-Element, das die-Anweisung abschließt. Am Ende jeder Anweisung ist ein Zeichen für den Zeilen Abschluss erforderlich.
Ein Zeichen für den Zeilen Abschluss dividiert die Zeichen einer Visual Basic Quelldatei in Zeilen. Beispiele für Zeilen Abschluss Zeichen sind das Unicode-Wagen Rücklauf Zeichen (&HD), das Unicode-Zeilenvorschub Zeichen (&ha) und das Unicode-Wagen Rücklauf Zeichen, gefolgt vom Unicode-Zeilenvorschub Zeichen. Weitere Informationen zu Zeilen Abschluss Zeichen finden Sie in der [Visual Basic-Sprachspezifikation](~/_vblang/spec/lexical-grammar.md#line-terminators).
**Fehler-ID:** BC30205
## <a name="to-correct-this-error"></a>So beheben Sie diesen Fehler
1. Überprüfen Sie, ob zwei unterschiedliche Anweisungen versehentlich in derselben Zeile abgelegt wurden.
2. Fügen Sie ein Zeilen Abschluss Zeichen nach dem Element ein, das die Anweisung abschließt.
## <a name="see-also"></a>Siehe auch
- [Vorgehensweise: Umbrechen und Zusammenfassen von Anweisungen in Code](../../programming-guide/program-structure/how-to-break-and-combine-statements-in-code.md)
- [Anweisungen](../../programming-guide/language-features/statements.md)
| 49.742857 | 467 | 0.802987 | deu_Latn | 0.972164 |
110738385efec48085ffa28c208c2a94b2b827f9 | 531 | md | Markdown | example/posts/What is HTTPS.md | Mochael/vue-blog | 714349f0662b7544ca93b988b2da037057e18f64 | [
"Apache-2.0"
] | null | null | null | example/posts/What is HTTPS.md | Mochael/vue-blog | 714349f0662b7544ca93b988b2da037057e18f64 | [
"Apache-2.0"
] | null | null | null | example/posts/What is HTTPS.md | Mochael/vue-blog | 714349f0662b7544ca93b988b2da037057e18f64 | [
"Apache-2.0"
] | null | null | null | ---
author: Michael
catalog: true
date: 2021-05-16
header_img: /img/in-post/new_header3.jpg
header_mask: rgba(40, 57, 101, .4)
header_style: image
subtitle: No subtitle
tags:
- notes
title: What is HTTPS
tree_state: 🌱
---
**HTTPS** (HyperText Transfer Protocol Secure) is an encrypted version of the [HTTP](https://developer.mozilla.org/en-US/docs/Glossary/HTTP) protocol. It used to use SSL (Secure Sockets Layer) encryption, but now uses TLS (Transport Layer Security) to encrypt all communication between a client and a server. | 35.4 | 308 | 0.762712 | eng_Latn | 0.935345 |
11093eb1eb5a9a1daaba0e51800e69b2a2ebd633 | 2,020 | md | Markdown | ui.md | fedotxxl/presentations | d956c5963a03d72be49e89ea46f171349dd083ee | [
"MIT"
] | null | null | null | ui.md | fedotxxl/presentations | d956c5963a03d72be49e89ea46f171349dd083ee | [
"MIT"
] | null | null | null | ui.md | fedotxxl/presentations | d956c5963a03d72be49e89ea46f171349dd083ee | [
"MIT"
] | null | null | null | # UI
## How we got here
### Before
Simple jquery frontend
#### pros:
* simple to implement
#### cons:
* hard to support (spaghetti code)
* slow frontend → a lot of traffic on each user action
* build system depends on server implementation
### Long way to choose technologies
* wav.tv → jquery → marrionettejs → anglarjs + reactjs
* ero-video.net → jquery → marrionettejs → angularjs?
* ero-pic.net (started 2015) → angularjs
* punyu (admin) (started 2016) → angularjs
### Where are we?
Frontend became more complicated because it serves more (features).
**We solve this complexity with appropriate technologies.**
### Our current technologies
We try to use same technologies on all new projects / move old projects to new technologies set (if it makes sense)
* CSS → SCSS
* GULP as build system (compiles SCSS / joins / minified css / js)
* AngularJS as JS framework
* ng-admin as admin building framework
### Why?
#### SCSS / GULP / AngularJS → leading technologies
* in general it means that it is good
* faster development
* larger community
** less bugs
** more information
** easy to find developers
#### ng-admin
* easy way to create common admin pages
* configurable
* less code to write → less bugs → faster development
### Where are we (pros / cons)
#### pros:
* faster fronted → less data to transfer by network
* faster development process → we can implement / deploy frontend / backend independently
* Separate frontend / backend:
** different developers for frontend and backend
** easier to support
#### cons:
* search engines (right now google tries to solve this problem)
* developer should remember about performance (most popular elements may be implemented with ReactJS)
### Where we want to go
Technologies evolves all the time... Our goal:
* Write maintainable code
* Fast dev / deploy cycle
* Use same techs on all projects
* (sometimes) write less code
=> be more productive and efficient
#### This year:
* ES6 (next version of JavaScript)
* AngularJS2
#### Next year:
… will see | 26.933333 | 115 | 0.731188 | eng_Latn | 0.995855 |
11094a7f71313e0c67c5d60d503124174c21badd | 127 | md | Markdown | changelog/2020.05.28/experimental-features/873-877.md | lava/vast | 0bc9e3c12eb31ec50dd0270626d55e84b2255899 | [
"BSD-3-Clause"
] | 249 | 2019-08-26T01:44:45.000Z | 2022-03-26T14:12:32.000Z | changelog/2020.05.28/experimental-features/873-877.md | 5l1v3r1/vast | a2cb4be879a13cef855da2c1d73083204aed4dff | [
"BSD-3-Clause"
] | 586 | 2019-08-06T13:10:36.000Z | 2022-03-31T08:31:00.000Z | changelog/2020.05.28/experimental-features/873-877.md | satta/vast | 6c7587effd4265c4a5de23252bc7c7af3ef78bee | [
"BSD-3-Clause"
] | 37 | 2019-08-16T02:01:14.000Z | 2022-02-21T16:13:59.000Z | Added a new `explore` command to VAST that can be used to show data records
within a certain time from the results of a query.
| 42.333333 | 75 | 0.779528 | eng_Latn | 0.999936 |
11097db6b70fe58005d4804b63c86f6c924c464b | 11,749 | md | Markdown | guides/openid-connect/sessions-loas.md | tanguilp/asteroid | 8e03221d365da7f03f82df192c535d3ba2101f4d | [
"Apache-2.0"
] | 36 | 2019-07-23T20:01:05.000Z | 2021-08-05T00:52:34.000Z | guides/openid-connect/sessions-loas.md | tanguilp/asteroid | 8e03221d365da7f03f82df192c535d3ba2101f4d | [
"Apache-2.0"
] | 19 | 2019-08-23T19:04:50.000Z | 2021-05-07T22:12:25.000Z | guides/openid-connect/sessions-loas.md | tanguilp/asteroid | 8e03221d365da7f03f82df192c535d3ba2101f4d | [
"Apache-2.0"
] | 3 | 2019-09-06T10:47:20.000Z | 2020-09-09T03:43:31.000Z | # Sessions and ACRs
Even though it's up to the developper to implement authentication and authorization web
workflows, Asteroid is capable of managing sessions, authentication class references (ACRs),
and associated concepts.
It supports the following use-cases:
- authentication step-up
- LOA decay
- offline access
It does not support any OpenID Connect logout specification.
This guide explains how to work with sessions.
## ACR configuration
LOAs are configured in the configuration file under the
[`oidc_acr_config`](Asteroid.Config.html#module-oidc_acr_config).
Here is one example of LOA confgiuration:
```elixir
config :asteroid, :oidc_acr_config, [
"3-factor": [
callback: &AsteroidWeb.LOA3_webflow.start_webflow/2,
auth_event_set: [["password", "otp", "webauthn"]]
],
"2-factor": [
callback: &AsteroidWeb.LOA2_webflow.start_webflow/2,
auth_event_set: [["password", "otp"], ["password", "webauthn"], ["webauthn", "otp"]]
],
"1-factor": [
callback: &AsteroidWeb.LOA1_webflow.start_webflow/2,
auth_event_set: [["password"], ["webauthn"]],
default: true
]
]
```
Note that the ACR name is an atom, which is converted back and forth to string when needed.
When initiating an OpenID Connect request on the authorize endpoint, Asteroid analyzes the
request to determine which ACR is requested, thanks to the the `"acr_values"` and `"claims"`
OpenID Connect parameters. It then processes it such as:
- if no LOA is requested, it uses the callback of the first configured LOA which is set as
`default: true`. If none of then is set as a default, it fall backs to the
[`:oauth2_flow_authorization_code_web_authorization_callback`](Asteroid.Config.html#module-oauth2_flow_authorization_code_web_authorization_callback)
or
[`:oauth2_flow_implicit_web_authorization_callback`](Asteroid.Config.html#module-oauth2_flow_implicit_web_authorization_callback)
callback
- if one or more LOAs are requested as the `"acr_values"`, or if one or more `"acr"` is requested
in the `"claims"` parameter as non essential, it uses the callback of the first
matching LOA of the above configuration option. If none matchs, it fall backs to
[`:oauth2_flow_authorization_code_web_authorization_callback`](Asteroid.Config.html#module-oauth2_flow_authorization_code_web_authorization_callback)
or
[`:oauth2_flow_implicit_web_authorization_callback`](Asteroid.Config.html#module-oauth2_flow_implicit_web_authorization_callback)
- if one or more `"acr"` is requested using the `"claims"` parameter as essential, it uses
the first matching LOA in the configuration option. In case there's no match, it returns an
error to the requester immediately, without calling any callback
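As an illustration, here is a minimal, hypothetical sketch of that selection logic, assuming a
configuration shaped like the example above (the module and function names are made up for this
guide and are not Asteroid's actual internals):
```elixir
defmodule ACRSelectionSketch do
  # acr_config: keyword list shaped like :oidc_acr_config
  # requested: ACR strings taken from "acr_values" or the "claims" parameter
  # essential?: whether "acr" was requested as an essential claim
  # fallback: the flow-specific web authorization callback
  def callback(acr_config, [], _essential?, fallback) do
    # No ACR requested: use the first LOA flagged as default, or else the fallback
    case Enum.find(acr_config, fn {_name, opts} -> opts[:default] == true end) do
      {_name, opts} -> opts[:callback]
      nil -> fallback
    end
  end
  def callback(acr_config, requested, essential?, fallback) do
    # One or more ACRs requested: take the first configured LOA that matches
    case Enum.find(acr_config, fn {name, _opts} -> Atom.to_string(name) in requested end) do
      {_name, opts} -> opts[:callback]
      nil -> if essential?, do: {:error, :no_matching_acr}, else: fallback
    end
  end
end
```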
## Authentication events
An authentication event is an event saved after successful authentication of a user, using a
specific authentication mechanism. Such an event can be:
- login with password
- login with WebAuthn
- login using an SMS OTP
- etc.
Those events ought to be registered in the authentication event store.
FIXME: describe configuration
An authentication event bears the following information:
- `:name`: the name of the event
- `:time`: the UNIX timestamp of the authentication event
- `:exp`: the UNIX time in the future when this authentication will be considered expired
- `:amr`: the authentication method reference. A `t:String.t/0` that gives information on the
specific mechanism used.
[OpenID Connect MODRNA Authentication Profile 1.0 - section 5](https://openid.net/specs/openid-connect-modrna-authentication-1_0.html#rfc.section.5)
gives such examples
- `:authenticated_session_id`: the identifier of the authenticated session
Note that having different `:name` and `:amr` values allows implementing different authentication
policies for the same authentication mechanism, or different means used for the same
authentication event (e.g. with AMRs being either `"smsotp"` or `"emailotp"`).
## Authenticated session with the use of authentication events
An authenticated session represents a continuous period of time during which a user is
authenticated. By authenticated, it means that:
- one or more active and valid authentication events are ongoing
- this or these authentication events match at least one LOA in the LOA configuration
If these conditions are not met, the authenticated session is destroyed.
It is decoupled from any client session mechanism such as web cookies. In other words, a
cookie used for authentication can remain active for a long time while the user is unauthenticated.
This allows, for example, saving known users (for the "Select an account" interface) in the cookie
data without keeping the user authenticated.
This does not mean that the session mechanism is unaware of the authenticated session id. On the
contrary, the web cookie probably needs to carry the authenticated session id,
since it cannot be found in another way (using the user id (`"sub"`) would not be sufficient, as
the user can be logged in from several browsers).
An authenticated session is an object that bears the following data:
- `:authenticated_session_id`: the id of the object
- `:subject_id`: the user id
- `:current_acr`: the current acr, as calculated by Asteroid
An authenticated session can be destroyed in the following cases:
- it is manually destroyed, using convenience functions provided by the
`Asteroid.OIDC.AuthenticatedSession` module
- the last authentication event related to an authenticated session is destroyed. In this case,
Asteroid detects that there is no longer any valid authentication event remaining
- this allows manually using authenticated sessions without using associated authentication
events, and therefore handling the LOA lifecycle in a non-automated way, bypassing Asteroid's
authenticated session management
## Stores
Authentication events and authenticated sessions are stored in their own stores. Refer to
[Token backends](token-stores.html#content) for more information.
## Processes calculating the current acr
The acr of an authenticated session object is recalculated:
- when an authentication event is created
- when an authentication event is destroyed
A process reads the existing valid authentication events from the authentication event store
and uses the [`oidc_acr_config`](Asteroid.Config.html#module-oidc_acr_config) configuration
option to determine the current acr. More specifically, it looks for a combination of
`:auth_event_set` that is equal to or a subset (in the sense of comparing sets) of the current valid
authentication events of the session.
**Beware**, this search is done in the list order of the configuration option. For instance, if the
current authentication events of a session are `["password", "otp"]` and with the following
configuration:
```elixir
config :asteroid, :oidc_acr_config, [
  loa1: [
    callback: &AsteroidWeb.LOA1_webflow.start_webflow/2,
    auth_event_set: [["password"], ["webauthn"], ["otp"]],
    default: true
  ],
  loa2: [
    callback: &AsteroidWeb.LOA2_webflow.start_webflow/2,
    auth_event_set: [["password", "otp"], ["password", "webauthn"], ["webauthn", "otp"]]
  ]
]
```
the result would be `"loa1"`, since `["password"]` alone already qualifies the session for an LOA
of `"loa1"` and `loa1` is listed first. Order matters, so to get the expected result (`"loa2"`)
it is necessary to change the order as follows:
```elixir
config :asteroid, :oidc_acr_config, [
  loa2: [
    callback: &AsteroidWeb.LOA2_webflow.start_webflow/2,
    auth_event_set: [["password", "otp"], ["password", "webauthn"], ["webauthn", "otp"]]
  ],
  loa1: [
    callback: &AsteroidWeb.LOA1_webflow.start_webflow/2,
    auth_event_set: [["password"], ["webauthn"], ["otp"]],
    default: true
  ]
]
```
The following schema illustrates how the current LOA can vary during the lifetime of an
authenticated session:

## Retrieving AMR and authentication time
The `Asteroid.OIDC.AuthenticatedSession.info/2` function allows retrieving the AMR and
authentication time of an authenticated session. It also allows requesting this information
for a specific acr.
Indeed, even if a user has just logged in, for example, with a second factor, some OpenID Connect
requests can still ask for ID tokens with the 1-factor ACR. In this case, the AMR will contain a
single factor, and the authentication time will be the authentication time of this first factor,
not of the second one.
### Example
```elixir
iex> Asteroid.Utils.astrenv(:oidc_acr_config)
[
"3-factor": [
callback: &AsteroidWeb.LOA3_webflow.start_webflow/2,
auth_event_set: [["password", "otp", "webauthn"]]
],
"2-factor": [
callback: &AsteroidWeb.LOA2_webflow.start_webflow/2,
auth_event_set: [
["password", "otp"],
["password", "webauthn"],
["webauthn", "otp"]
]
],
"1-factor": [
callback: &AsteroidWeb.LOA1_webflow.start_webflow/2,
auth_event_set: [["password"], ["webauthn"]],
default: true
]
]
iex> alias Asteroid.OIDC.AuthenticationEvent, as: AE
Asteroid.OIDC.AuthenticationEvent
iex> alias Asteroid.OIDC.AuthenticatedSession, as: AS
Asteroid.OIDC.AuthenticatedSession
iex> {:ok, as} = AS.gen_new("user_1") |> AS.store()
{:ok,
%Asteroid.OIDC.AuthenticatedSession{
data: %{},
id: "gvycEhbeig9RjwUu36UEGTxO1MNRE3qQ9WHisfpk0Zk",
subject_id: "user_1"
}}
iex> AE.gen_new(as.id) |> AE.put_value("name", "password") |> AE.put_value("amr", "pwd") |> AE.put_value("time", 100000) |> AE.store()
{:ok,
%Asteroid.OIDC.AuthenticationEvent{
authenticated_session_id: "gvycEhbeig9RjwUu36UEGTxO1MNRE3qQ9WHisfpk0Zk",
data: %{"amr" => "pwd", "name" => "password", "time" => 100000},
id: "WxQ6AHMRthQlk9cqsGUMVWsFNZ3EeNjyFfNCRYkiF20"
}}
iex> AE.gen_new(as.id) |> AE.put_value("name", "otp") |> AE.put_value("amr", "otp") |> AE.put_value("time", 200000)|> AE.store()
{:ok,
%Asteroid.OIDC.AuthenticationEvent{
authenticated_session_id: "gvycEhbeig9RjwUu36UEGTxO1MNRE3qQ9WHisfpk0Zk",
data: %{"amr" => "otp", "name" => "otp", "time" => 200000},
id: "QnZZE82I4St41JieLpLg8z3HG_T8l6yutlt3dPo_Yx8"
}}
iex> AE.gen_new(as.id) |> AE.put_value("name", "webauthn") |> AE.put_value("amr", "phr") |> AE.put_value("time", 300000)|> AE.store()
{:ok,
%Asteroid.OIDC.AuthenticationEvent{
authenticated_session_id: "gvycEhbeig9RjwUu36UEGTxO1MNRE3qQ9WHisfpk0Zk",
data: %{"amr" => "phr", "name" => "webauthn", "time" => 300000},
id: "N_V4i9lz5obd-3C0XZagZGtOFuDMZo0ywXSBjoum0KY"
}}
iex> AS.info(as.id)
%{acr: "3-factor", amr: ["otp", "phr", "pwd"], auth_time: 300000}
iex> AS.info(as.id, "1-factor")
%{acr: "1-factor", amr: ["pwd"], auth_time: 100000}
iex> AS.info(as.id, "2-factor")
%{acr: "2-factor", amr: ["otp", "pwd"], auth_time: 200000}
iex> AS.info(as.id, "3-factor")
%{acr: "3-factor", amr: ["otp", "phr", "pwd"], auth_time: 300000}
```
## Using authenticated sessions without authentication events
Since an authenticated session is automatically updated or destroyed based on its authentication
events, it is possible to manually manage an authenticated session by not linking any
authentication event to it.
This way, the current acr and other properties of an authenticated session can be updated
programmatically, with no automatic processes updating it.
## Relation between authentication objects and tokens
In a nutshell, in an OpenID Connect flow:
- the expiration of an authentication event may lead to its authenticated session object being
discarded...
- which may in turn destroy the refresh token**s** associated with it (if they weren't requested with
the `"offline_access"` scope)...
- which will result in the stored access tokens associated with this or these refresh
tokens being discarded altogether
| 42.879562 | 149 | 0.757682 | eng_Latn | 0.981836 |
1109dad1beb6253a5da00758f2c6782cc59098c7 | 14,263 | md | Markdown | articles/fin-ops-core/dev-itpro/migration-upgrade/data-upgrade-self-service.md | ryan-mcculloch/dynamics-365-unified-operations-public | 7aee6d287f07900b2d7f967d12f5cbe9b9d5daa3 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/fin-ops-core/dev-itpro/migration-upgrade/data-upgrade-self-service.md | ryan-mcculloch/dynamics-365-unified-operations-public | 7aee6d287f07900b2d7f967d12f5cbe9b9d5daa3 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/fin-ops-core/dev-itpro/migration-upgrade/data-upgrade-self-service.md | ryan-mcculloch/dynamics-365-unified-operations-public | 7aee6d287f07900b2d7f967d12f5cbe9b9d5daa3 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
# required metadata
title: Upgrade from AX 2012 - Data upgrade in self-service environments
description: This topic explains how to do a data upgrade from Microsoft Dynamics AX 2012 in self-service environments.
author: sarvanisathish
ms.date: 06/09/2021
ms.topic: article
audience: IT Pro
ms.reviewer: sericks
ms.search.region: Global
ms.author: tfehr
ms.search.validFrom: 2021-06-30
---
# Upgrade from AX 2012 - Data upgrade in self-service environments
[!include[banner](../includes/banner.md)]
> [!IMPORTANT]
> Some or all of the functionality noted in this topic is available as part of a preview release. The content and the functionality are subject to change.
## Prerequisites
1. Download and install the [.NET core software development kit (SDK)](https://dotnet.microsoft.com/download/dotnet/thank-you/sdk-3.1.409-windows-x64-installer) if it isn't already installed.
2. Create a self-service environment in Microsoft Dynamics Lifecycle Services (LCS). The environment should be in a **Deployed** state.
3. Make sure that the replication feature is installed and enabled for the source SQL Server instance. To determine whether replication is enabled, run the following SQL script.
```sql
-- If @installed is 0, replication must be added to the SQL Server installation.
USE master;
GO
DECLARE @installed int;
EXEC @installed = sys.sp_MS_replication_installed;
SELECT @installed;
```
If the replication components aren't installed, follow the steps in [Install SQL Server replication](/sql/database-engine/install-windows/install-sql-server-replication?view=sql-server-ver15) to install them.
4. Enable and start the SQL Server Agent on the source database server.
> [!NOTE]
> A user should have the **DB\_Owner** privilege in the source database, and should have access to the master database and the source database.
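> To quickly verify the privilege, you can run the following check. This is a hedged example (it isn't part of the official toolkit); substitute your own source database name.
>
> ```sql
> -- Returns 1 if the current user is a member of db_owner in the source database.
> USE MicrosoftDynamicsAX; -- replace with your source database name
> SELECT IS_ROLEMEMBER('db_owner') AS is_db_owner;
> ```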
5. **Migration toolkit setup:** If you don't want some of the source database tables to be replicated in the target database, you can specify them in the IgnoreTables.xml file. Likewise, if you don't want some of the functions to be replicated, you can specify them in the IgnoreFunctions.xml file.
- **Path of the IgnoreTables.xml file:** Data\\IgnoreTables.xml
- **Path of the IgnoreFunctions.xml file:** Data\\IgnoreFunctions.xml
The following examples show how to specify tables and functions in the XML files.
```xml
<?xml version="1.0" encoding="utf-8"?>
<IgnoreTables>
<Name>
<Table>USERADDHISTORYLIST</Table>
<Table>TAXRECONCILIATIONREPORTTMP</Table>
<Table>CASELOG</Table>
<Table>SHAREDCATEGORYROLETYPE</Table>
<Table>VATCSREPORTXMLATTRIBUTE_CZ</Table>
</Name>
</IgnoreTables>
```
```xml
<?xml version="1.0" encoding="utf-8"?>
<IgnoreFunctions>
<Name>
<Function>if_WHSInventReserveUnionDelta</Function>
</Name>
</IgnoreFunctions>
```
> [!WARNING]
> The tables and functions that are specified in these XML files won't be replicated in the target database, and the same format should be followed.
> [!NOTE]
> If any of the preceding steps fail at any point, see the [Exceptions](#exceptions) section later in this topic.
## Configure replication
Before you begin the replication process, note that the LCS environment will be in a **Deployed** state when it's created.
1. Run the **AX2012DataUpgradeToolKit.exe** application.
A console window is opened, and you're redirected to the Microsoft sign-in page for authentication.
2. Provide the credentials that are used to sign in to LCS.
After you're successfully authenticated, you receive the following message in the browser: "Authentication complete. You can return to the application. Feel free to close this browser tab."
You can now close the browser tab.
3. In the console window, enter the **Project-Id** value, and then select **Enter**. Then provide the **Environment-Id** value, and select **Enter**.
> [!NOTE]
> You can find the **Project-Id** and **Environment-Id** values on the **Manage environment** page in LCS. You can also find the **Environment-Id** value on the **Environment details** page.
The application connects to the LCS environment and validates the connection to the target database.
After the validation is successful, the application presents a set of menu options that correspond to the steps in the data upgrade process. To complete the data replication and upgrade, you should perform these steps in order.
1. **Data upgrade preparation: Environment setup activity**
This step prompts you for the following information:
- Details of the source database:
- Source server (in the format *servername\\serverinstance*)
- Source database name
- User name
- Password
- IP address of the source database server (for the allowlist)
- Distribution database path (for example, **D:\\SQLServer\\Data**)
- Replication snapshot path (for example, **D:\\SQLServer\\Snapshot**)
> [!WARNING]
> The specified distribution database and replication snapshot paths should have enough space. We recommend that the amount of space be at least the size of the source database.
This step performs the following actions:
- Validate the connection to the source database.
- Validate the version of the AX 2012 database.
- Authorize the source IP address.
- Validate the target databases.
2. **Data upgrade preparation: Prepare the target environment for the data upgrade**
This step changes the state of the LCS environment from **Deployed** to **Ready for replication**.
3. **Replication: Clean-up target database**
This step performs the following actions:
1. Change the state of the LCS environment from **Ready for replication** to **Replication in progress**.
2. Delete the database owner (dbo) objects of tables, views, stored procedures, and user-defined functions in the target database.
4. **Replication: Set up distributor**
This step creates a distribution database under the **System Databases** folder on the source server. This distribution database is used for replication.
5. **Replication: Set up publication for primary key tables**
This step creates publications for primary key tables under the **Replication** folder on the source server and replicates them in the target database. If any **ignore-table** entries are specified, the specified tables are exempted from replication.
**Created publishers:** AXDB\_PUB\_TABLE\_Obj\_\[\*\]
> [!NOTE]
> After this replication configuration step is completed, actual data replication will occur as a SQL job that runs in the background. This job will take some time to be completed. You can view the status of the replication by providing the **'rs**' option.
6. **Replication: Set up publication for other objects (functions)**
This step creates a publication for other objects (functions) and replicates them in the target database. If you don't want some of the functions to be replicated, you can specify them in the IgnoreFunctions.xml file.
**Created publisher:** AX\_PUB\_OtherObjects
> [!NOTE]
> The replication will take some time to be completed. You can view the replication status by providing the **'rs'** option.
>
> If there are no functions to replicate, the publication won't be created.
> [!WARNING]
> Don't move on to the next step until the **DataReplicationStatus** property for this step is shown as completed.
7. **Cutover: Set up publication for non-primary key tables**
This step creates two publications: one that is used to replicate non-primary key tables, and one that is used to replicate locked tables.
**Publication names:** AX\_PUB\_NoPKTable, AX\_PUB\_TABLE\_LockedTable
If AX Service acquires a schema lock during creation of the primary key publication, those tables will be ignored and omitted from the publication. They will be added to temporary tables and marked for replication during creation of the cutover publication.
> [!WARNING]
> Don't move on to the next step until the **DataReplicationStatus** property for this step is shown as completed.
8. **Cutover: Remove non-primary key publication and temporary tables**
This step performs the following actions:
1. Clean up the temporary tables that were created for non-primary key tables in the source database.
2. Delete the **AX\_PUB\_NoPKTable** publication.
9. **Cutover: Create constraint for non-primary key tables**
This step extracts constraints for the non-primary key tables from the source database and creates them in the target database.
10. **Cutover: Remove replication setup**
This step deletes all the publications that were created in the source database, the distribution database, and the replication snapshot.
> [!NOTE]
> To remove the **Snapshot** folder without causing an exception, run the following script in the source database. Even if you don't run this script, you can ignore the exception message that you receive.
>
> ```sql
> EXEC master.dbo.sp_configure 'show advanced options', 1
>
> RECONFIGURE WITH OVERRIDE
>
> EXEC master.dbo.sp_configure 'xp_cmdshell', 1
>
> RECONFIGURE WITH OVERRIDE
> ```
11. **Post-replication: Update environment state to Replicated**
This step changes the state of the LCS environment from **Replication in progress** to **Replication completed**.
12. **Data Upgrade: Trigger upgrade**
This step performs the following actions:
1. While the data upgrade is occurring, change the state of the LCS environment from **Replication completed** to **Data upgrade in progress**.
2. After the data upgrade is completed, change the state of the LCS environment from **Data upgrade in progress** to **Deployed**.
At this point, the data upgrade process is completed. The status of all the steps in the process should be shown as completed.
If the data upgrade fails, the state of the LCS environment will be **Failed**, and the status of the menu option for the failed step will be **Resume** in the console application. In this case, resume the failed step. The state of the LCS environment is then changed to **Servicing**.
## Reporting section of the application
You can use the following options to review the reports of the replication validation, replication status, and data upgrade status:
- **dv) Report:** Validate the replication.
This option compares the number of tables and records in the source server database and the target server database, and then shows the report. You should use this option only after step 12 is completed.
You can find the report data at **output/PostValidationInfo.csv**.
- **rs) Report:** Get the replication status.
This option shows the report of the replication process for the publications that were created. You should use this option only after step 3 is started (that is, during the replication process for any publication).
- **ds) Report:** Get the data upgrade status.
This option shows the report of the data upgrade process. You should use this option only after step 12 is started.
## Tooling section of the application
- **Reset:** Reset the replication setup by removing all the replication configurations. Publications and the distribution database are deleted. The status of all menu options is reset from **Completed** to **Reset** mode to help you redo the replication from the beginning.
- **Reset-all:** Reset all the menu options. Both the **Reset** and **Clear** operations will be performed, and all the options will be changed to **Not Started**.
- **Clear:** Clear the environment setup activity. All information is cleared from the cache, such as the **project-Id** value, **Environment-Id** value, and source database details. The status of step 1 is changed to **Not Started**.
- **Help:** Show the data upgrade migration. The menu option status is shown together with the updated status.
- **Exit:** Close the application.
## Learn about the replication configuration and status via SQL Server Management Studio
In SQL Server Management Studio (SSMS), if Object Explorer includes a **Replication** folder, the replication feature is installed on the server and available.
After step 3 of the data upgrade process is completed, you should find the publisher configured under the **Replication** folder. To learn the replication status, select and hold (or right-click) the **Replication** folder, and then select **Launch Replication Monitor**.
- In the replication monitor, you can view all the publishers that have been created for replication.
- On the **Snapshot** tab, you can view the status of the snapshot.
- To view the detail log/transaction, double-tap (or double-click) a grid item.
- To view the data replication to the target, on the **All Subscription** tab, double-tap (or double-click) the subscription from the grid item.
## Exceptions
- If the distribution database already exists on the source database server, follow these steps to delete it:
1. On the source database server, expand the **Replication** folder, select and hold (or right-click) the **Local Publications** folder, and then select **Generate Scripts**.
A tab that is named **Generate SQL Script** is opened.
2. Select the **To drop or disable the components** option, and then, in the **Generate Script** field, select **Open in New Query Window**.
The script is opened in a new query window.
3. Run the script in the source database.
- After you create a publication, snapshot creation might fail, and you might receive the following error message: "Prefetch objects failed for Database 'MicrosoftDynamicsAX'."
To fix this exception, follow these steps:
1. Select and hold (or right-click) the **Replication** folder, and then select **Launch Replication Monitor**.
2. Select and hold (or right-click) the failed publication, and then select **Generate Snapshot**.
3. After the snapshot is generated, you can view the replication status by using the **'rs'** option.
| 49.524306 | 298 | 0.739746 | eng_Latn | 0.993125 |
1109ea0141b62ae59e6332f03f93afe59dac3942 | 11,883 | md | Markdown | libs/azure/src/iothub_client/devdoc/requirement_docs/iothubtransportamqp_ws_requirements.md | InfiniteYuan1/gprs_a9_azure | dec715c24e82a86ef6a80183409fb93d5daefd8d | [
"MIT"
] | 1 | 2018-01-16T21:08:45.000Z | 2018-01-16T21:08:45.000Z | libs/azure/src/iothub_client/devdoc/requirement_docs/iothubtransportamqp_ws_requirements.md | InfiniteYuan1/gprs_a9_azure | dec715c24e82a86ef6a80183409fb93d5daefd8d | [
"MIT"
] | null | null | null | libs/azure/src/iothub_client/devdoc/requirement_docs/iothubtransportamqp_ws_requirements.md | InfiniteYuan1/gprs_a9_azure | dec715c24e82a86ef6a80183409fb93d5daefd8d | [
"MIT"
] | 5 | 2018-01-16T20:57:34.000Z | 2020-10-14T15:11:42.000Z |
# IoTHubTransportAMQP_WS Requirements
================
## Overview
IoTHubTransportAMQP_WS is the library that enables communications with the iothub system using the AMQP protocol over Websockets with TLS connection.
## Dependencies
iothubtransport_amqp_common
azure_c_shared_utility
## Exposed API
```c
static const TRANSPORT_PROVIDER* AMQP_Protocol_over_WebSocketsTls(void);
```
The following static functions are provided in the fields of the TRANSPORT_PROVIDER structure:
- IoTHubTransportAMQP_WS_Subscribe_DeviceMethod,
- IoTHubTransportAMQP_WS_SendMessageDisposition,
- IoTHubTransportAMQP_WS_Unsubscribe_DeviceMethod,
- IoTHubTransportAMQP_WS_Subscribe_DeviceTwin,
- IoTHubTransportAMQP_WS_Unsubscribe_DeviceTwin,
- IoTHubTransportAMQP_WS_ProcessItem,
- IoTHubTransportAMQP_WS_GetHostname,
- IoTHubTransportAMQP_WS_SetOption,
- IoTHubTransportAMQP_WS_Create,
- IoTHubTransportAMQP_WS_Destroy,
- IoTHubTransportAMQP_WS_SetRetryLogic,
- IoTHubTransportAMQP_WS_Register,
- IoTHubTransportAMQP_WS_Unregister,
- IoTHubTransportAMQP_WS_Subscribe,
- IoTHubTransportAMQP_WS_Unsubscribe,
- IoTHubTransportAMQP_WS_DoWork,
- IoTHubTransportAMQP_WS_GetSendStatus,
- IotHubTransportAMQP_WS_Subscribe_InputQueue,
- IotHubTransportAMQP_WS_Unsubscribe_InputQueue
## IoTHubTransportAMQP_WS_Create
```c
static TRANSPORT_LL_HANDLE IoTHubTransportAMQP_WS_Create(const IOTHUBTRANSPORT_CONFIG* config)
```
**SRS_IOTHUBTRANSPORTAMQP_WS_09_001: [**IoTHubTransportAMQP_WS_Create shall create a TRANSPORT_LL_HANDLE by calling into the IoTHubTransport_AMQP_Common_Create function, passing `config` and getWebSocketsIOTransport.**]**
### getWebSocketsIOTransport
```c
static XIO_HANDLE getWebSocketsIOTransport(const char* fqdn, const AMQP_TRANSPORT_PROXY_OPTIONS* amqp_transport_proxy_options)
```
**SRS_IOTHUBTRANSPORTAMQP_WS_01_001: [** `getIoTransportProvider` shall obtain the WebSocket IO interface handle by calling `wsio_get_interface_description`. **]**
**SRS_IOTHUBTRANSPORTAMQP_WS_01_002: [** `getIoTransportProvider` shall call `xio_create` while passing the WebSocket IO interface description to it and the WebSocket configuration as a WSIO_CONFIG structure, filled as below: **]**
**SRS_IOTHUBTRANSPORTAMQP_WS_01_003: [** - `hostname` shall be set to `fqdn`. **]**
**SRS_IOTHUBTRANSPORTAMQP_WS_01_004: [** - `port` shall be set to 443. **]**
**SRS_IOTHUBTRANSPORTAMQP_WS_01_005: [** - `protocol` shall be set to `AMQPWSB10`. **]**
**SRS_IOTHUBTRANSPORTAMQP_WS_01_006: [** - `resource_name` shall be set to `/$iothub/websocket`. **]**
**SRS_IOTHUBTRANSPORTAMQP_WS_01_007: [** - `underlying_io_interface` shall be set to the TLS IO interface description. **]**
**SRS_IOTHUBTRANSPORTAMQP_WS_01_008: [** - `underlying_io_parameters` shall be set to the TLS IO arguments. **]**
**SRS_IOTHUBTRANSPORTAMQP_WS_01_009: [** `getIoTransportProvider` shall obtain the TLS IO interface handle by calling `platform_get_default_tlsio`. **]**
**SRS_IOTHUBTRANSPORTAMQP_WS_01_010: [** The TLS IO parameters shall be a `TLSIO_CONFIG` structure filled as below: **]**
**SRS_IOTHUBTRANSPORTAMQP_WS_01_011: [** - `hostname` shall be set to `fqdn`. **]**
**SRS_IOTHUBTRANSPORTAMQP_WS_01_012: [** - `port` shall be set to 443. **]**
**SRS_IOTHUBTRANSPORTAMQP_WS_01_013: [** - If `amqp_transport_proxy_options` is NULL, `underlying_io_interface` shall be set to NULL. **]**
**SRS_IOTHUBTRANSPORTAMQP_WS_01_014: [** - If `amqp_transport_proxy_options` is NULL `underlying_io_parameters` shall be set to NULL. **]**
**SRS_IOTHUBTRANSPORTAMQP_WS_01_015: [** - If `amqp_transport_proxy_options` is not NULL, `underlying_io_interface` shall be set to the HTTP proxy IO interface description. **]**
**SRS_IOTHUBTRANSPORTAMQP_WS_01_016: [** - If `amqp_transport_proxy_options` is not NULL `underlying_io_parameters` shall be set to the HTTP proxy IO arguments. **]**
**SRS_IOTHUBTRANSPORTAMQP_WS_01_022: [** `getIoTransportProvider` shall obtain the HTTP proxy IO interface handle by calling `http_proxy_io_get_interface_description`. **]**
**SRS_IOTHUBTRANSPORTAMQP_WS_01_023: [** The HTTP proxy IO arguments shall be an `HTTP_PROXY_IO_CONFIG` structure, filled as below: **]**
**SRS_IOTHUBTRANSPORTAMQP_WS_01_024: [** - `hostname` shall be set to `fully_qualified_name`. **]**
**SRS_IOTHUBTRANSPORTAMQP_WS_01_025: [** - `port` shall be set to 443. **]**
**SRS_IOTHUBTRANSPORTAMQP_WS_01_026: [** - `proxy_hostname`, `proxy_port`, `username` and `password` shall be copied from the `mqtt_transport_proxy_options` argument. **]**
**SRS_IOTHUBTRANSPORTAMQP_WS_01_028: [** If `http_proxy_io_get_interface_description` returns NULL, NULL shall be set in the TLS IO parameters structure for the interface description and parameters. **]**
**SRS_IOTHUBTRANSPORTAMQP_WS_01_029: [** If `platform_get_default_tlsio` returns NULL, NULL shall be set in the WebSocket IO parameters structure for the interface description and parameters. **]**
**SRS_IOTHUBTRANSPORTAMQP_WS_09_003: [**If `io_interface_description` is NULL getWebSocketsIOTransport shall return NULL.**]**
**SRS_IOTHUBTRANSPORTAMQP_WS_09_004: [**getWebSocketsIOTransport shall return the XIO_HANDLE created using xio_create().**]**
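For illustration only, a sketch of code satisfying the requirements above might look as follows. The structure fields and helper functions are the ones named by the requirements; error handling and the HTTP proxy branch are omitted, and this is not presented as the SDK's actual implementation.
```c
/* Illustrative sketch only -- not the actual SDK implementation */
static XIO_HANDLE getWebSocketsIOTransport(const char* fqdn, const AMQP_TRANSPORT_PROXY_OPTIONS* amqp_transport_proxy_options)
{
    XIO_HANDLE result = NULL;
    const IO_INTERFACE_DESCRIPTION* ws_io_interface = wsio_get_interface_description();
    (void)amqp_transport_proxy_options; /* proxy branch omitted in this sketch */
    if (ws_io_interface != NULL) /* per SRS_..._09_003, NULL is returned otherwise */
    {
        /* TLS IO parameters for the no-proxy case (SRS_..._01_010 to SRS_..._01_014) */
        TLSIO_CONFIG tls_io_config;
        tls_io_config.hostname = fqdn;
        tls_io_config.port = 443;
        tls_io_config.underlying_io_interface = NULL;
        tls_io_config.underlying_io_parameters = NULL;
        /* WebSocket IO parameters (SRS_..._01_002 to SRS_..._01_008) */
        WSIO_CONFIG ws_io_config;
        ws_io_config.hostname = fqdn;
        ws_io_config.port = 443;
        ws_io_config.protocol = "AMQPWSB10";
        ws_io_config.resource_name = "/$iothub/websocket";
        ws_io_config.underlying_io_interface = platform_get_default_tlsio();
        ws_io_config.underlying_io_parameters = &tls_io_config;
        /* SRS_..._09_004: return the XIO_HANDLE created using xio_create() */
        result = xio_create(ws_io_interface, &ws_io_config);
    }
    return result;
}
```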
## IoTHubTransportAMQP_WS_Destroy
```c
static void IoTHubTransportAMQP_WS_Destroy(TRANSPORT_LL_HANDLE handle)
```
**SRS_IOTHUBTRANSPORTAMQP_WS_09_005: [**IoTHubTransportAMQP_WS_Destroy shall destroy the TRANSPORT_LL_HANDLE by calling into the IoTHubTransport_AMQP_Common_Destroy().**]**
## IoTHubTransportAMQP_WS_Register
```c
static IOTHUB_DEVICE_HANDLE IoTHubTransportAMQP_WS_Register(TRANSPORT_LL_HANDLE handle, const IOTHUB_DEVICE_CONFIG* device, IOTHUB_CLIENT_LL_HANDLE iotHubClientHandle, PDLIST_ENTRY waitingToSend)
```
**SRS_IOTHUBTRANSPORTAMQP_WS_09_006: [**IoTHubTransportAMQP_WS_Register shall register the device by calling into the IoTHubTransport_AMQP_Common_Register().**]**
## IoTHubTransportAMQP_WS_Unregister
```c
static void IoTHubTransportAMQP_WS_Unregister(IOTHUB_DEVICE_HANDLE deviceHandle)
```
**SRS_IOTHUBTRANSPORTAMQP_WS_09_007: [**IoTHubTransportAMQP_WS_Unregister shall unregister the device by calling into the IoTHubTransport_AMQP_Common_Unregister().**]**
## IoTHubTransportAMQP_WS_Subscribe_DeviceTwin
```c
int IoTHubTransportAMQP_WS_Subscribe_DeviceTwin(IOTHUB_DEVICE_HANDLE handle)
```
**SRS_IOTHUBTRANSPORTAMQP_WS_09_008: [**IoTHubTransportAMQP_WS_Subscribe_DeviceTwin shall invoke IoTHubTransport_AMQP_Common_Subscribe_DeviceTwin() and return its result.**]**
## IoTHubTransportAMQP_WS_Unsubscribe_DeviceTwin
```c
void IoTHubTransportAMQP_WS_Unsubscribe_DeviceTwin(IOTHUB_DEVICE_HANDLE handle)
```
**SRS_IOTHUBTRANSPORTAMQP_WS_09_009: [**IoTHubTransportAMQP_WS_Unsubscribe_DeviceTwin shall invoke IoTHubTransport_AMQP_Common_Unsubscribe_DeviceTwin()**]**
## IoTHubTransportAMQP_WS_Subscribe_DeviceMethod
```c
int IoTHubTransportAMQP_WS_Subscribe_DeviceMethod(IOTHUB_DEVICE_HANDLE handle)
```
**SRS_IOTHUBTRANSPORTAMQP_WS_09_010: [**IoTHubTransportAMQP_WS_Subscribe_DeviceMethod shall invoke IoTHubTransport_AMQP_Common_Subscribe_DeviceMethod() and return its result.**]**
## IoTHubTransportAMQP_WS_Unsubscribe_DeviceMethod
```c
void IoTHubTransportAMQP_WS_Unsubscribe_DeviceMethod(IOTHUB_DEVICE_HANDLE handle)
```
**SRS_IOTHUBTRANSPORTAMQP_WS_09_011: [**IoTHubTransportAMQP_WS_Unsubscribe_DeviceMethod shall invoke IoTHubTransport_AMQP_Common_Unsubscribe_DeviceMethod()**]**
## IoTHubTransportAMQP_WS_Subscribe
```c
static int IoTHubTransportAMQP_WS_Subscribe(TRANSPORT_LL_HANDLE handle)
```
**SRS_IOTHUBTRANSPORTAMQP_WS_09_012: [**IoTHubTransportAMQP_WS_Subscribe shall subscribe for D2C messages by calling into the IoTHubTransport_AMQP_Common_Subscribe().**]**
## IoTHubTransportAMQP_WS_Unsubscribe
```c
static void IoTHubTransportAMQP_WS_Unsubscribe(TRANSPORT_LL_HANDLE handle)
```
**SRS_IOTHUBTRANSPORTAMQP_WS_09_013: [**IoTHubTransportAMQP_WS_Unsubscribe shall subscribe for D2C messages by calling into the IoTHubTransport_AMQP_Common_Unsubscribe().**]**
## IoTHubTransportAMQP_WS_ProcessItem
```c
static IOTHUB_PROCESS_ITEM_RESULT IoTHubTransportAMQP_WS_ProcessItem(TRANSPORT_LL_HANDLE handle, IOTHUB_IDENTITY_TYPE item_type, IOTHUB_IDENTITY_INFO* iothub_item)
```
**SRS_IOTHUBTRANSPORTAMQP_WS_09_014: [**IoTHubTransportAMQP_WS_ProcessItem shall invoke IoTHubTransport_AMQP_Common_ProcessItem() and return its result.**]**
## IoTHubTransportAMQP_WS_DoWork
```c
static void IoTHubTransportAMQP_WS_DoWork(TRANSPORT_LL_HANDLE handle, IOTHUB_CLIENT_LL_HANDLE iotHubClientHandle)
```
**SRS_IOTHUBTRANSPORTAMQP_WS_09_015: [**IoTHubTransportAMQP_WS_DoWork shall call into the IoTHubTransport_AMQP_Common_DoWork()**]**
## IoTHubTransportAMQP_WS_GetSendStatus
```c
IOTHUB_CLIENT_RESULT IoTHubTransportAMQP_WS_GetSendStatus(TRANSPORT_LL_HANDLE handle, IOTHUB_CLIENT_STATUS *iotHubClientStatus)
```
**SRS_IOTHUBTRANSPORTAMQP_WS_09_016: [**IoTHubTransportAMQP_WS_GetSendStatus shall get the send status by calling into the IoTHubTransport_AMQP_Common_GetSendStatus()**]**
## IoTHubTransportAMQP_WS_SetOption
```c
IOTHUB_CLIENT_RESULT IoTHubTransportAMQP_WS_SetOption(TRANSPORT_LL_HANDLE handle, const char* optionName, const void* value)
```
**SRS_IOTHUBTRANSPORTAMQP_WS_09_017: [**IoTHubTransportAMQP_WS_SetOption shall set the options by calling into the IoTHubTransport_AMQP_Common_SetOption()**]**
## IoTHubTransportAMQP_WS_GetHostname
```c
static STRING_HANDLE IoTHubTransportAMQP_WS_GetHostname(TRANSPORT_LL_HANDLE handle)
```
**SRS_IOTHUBTRANSPORTAMQP_WS_09_018: [**IoTHubTransportAMQP_WS_GetHostname shall get the hostname by calling into the IoTHubTransport_AMQP_Common_GetHostname()**]**
## IoTHubTransportAMQP_WS_SendMessageDisposition
```c
static STRING_HANDLE IoTHubTransportAMQP_WS_SendMessageDisposition(TRANSPORT_LL_HANDLE handle)
```
**SRS_IOTHUBTRANSPORTAMQP_WS_10_001: [**IoTHubTransportAMQP_WS_SendMessageDisposition shall send the message disposition by calling into the IoTHubTransport_AMQP_Common_SendMessageDisposition()**]**
## IotHubTransportAMQP_WS_Subscribe_InputQueue
```c
static int IotHubTransportAMQP_WS_Subscribe_InputQueue(IOTHUB_DEVICE_HANDLE handle)
```
Not implemented (yet).
## IotHubTransportAMQP_WS_Unsubscribe_InputQueue
```c
static void IotHubTransportAMQP_WS_Unsubscribe_InputQueue(IOTHUB_DEVICE_HANDLE handle)
```
Not implemented (yet).
### AMQP_Protocol_over_WebSocketsTls
```c
const TRANSPORT_PROVIDER* AMQP_Protocol_over_WebSocketsTls(void)
```
**SRS_IOTHUBTRANSPORTAMQP_WS_09_019: [**This function shall return a pointer to a structure of type TRANSPORT_PROVIDER having the following values for it's fields:
IoTHubTransport_SendMessageDisposition = IoTHubTransportAMQP_WS_SendMessageDisposition
IoTHubTransport_Subscribe_DeviceMethod = IoTHubTransportAMQP_WS_Subscribe_DeviceMethod
IoTHubTransport_Unsubscribe_DeviceMethod = IoTHubTransportAMQP_WS_Unsubscribe_DeviceMethod
IoTHubTransport_Subscribe_DeviceTwin = IoTHubTransportAMQP_WS_Subscribe_DeviceTwin
IoTHubTransport_Unsubscribe_DeviceTwin = IoTHubTransportAMQP_WS_Unsubscribe_DeviceTwin
IoTHubTransport_ProcessItem = IoTHubTransportAMQP_WS_ProcessItem
IoTHubTransport_GetHostname = IoTHubTransportAMQP_WS_GetHostname
IoTHubTransport_Create = IoTHubTransportAMQP_WS_Create
IoTHubTransport_Destroy = IoTHubTransportAMQP_WS_Destroy
IoTHubTransport_Subscribe = IoTHubTransportAMQP_WS_Subscribe
IoTHubTransport_Unsubscribe = IoTHubTransportAMQP_WS_Unsubscribe
IoTHubTransport_DoWork = IoTHubTransportAMQP_WS_DoWork
IoTHubTransport_SetRetryLogic = IoTHubTransportAMQP_WS_SetRetryLogic
IoTHubTransport_SetOption = IoTHubTransportAMQP_WS_SetOption
IoTHubTransport_Subscribe_InputQueue = IoTHubTransportAMQP_WS_Subscribe_InputQueue
IoTHubTransport_Unsubscribe_InputQueue = IotHubTransportAMQP_WS_Unsubscribe_InputQueue**]**
| 42.288256 | 231 | 0.83144 | yue_Hant | 0.908228 |
110a8c0e7d75708e0f2a8326e1cd5eb8847bd82b | 14,712 | md | Markdown | MicrosoftSearch/plan-your-content.md | isabella232/OfficeDocs-MicrosoftSearch-pr.de-DE | fa6990a974520a8a6f570028846cddb52b857fb5 | [
"CC-BY-4.0",
"MIT"
] | 1 | 2021-04-21T00:13:50.000Z | 2021-04-21T00:13:50.000Z | MicrosoftSearch/plan-your-content.md | MicrosoftDocs/OfficeDocs-MicrosoftSearch-pr.de-DE | 424fa2a4d178f734e8a6626100304bf6acea8e32 | [
"CC-BY-4.0",
"MIT"
] | 2 | 2022-02-09T07:07:44.000Z | 2022-02-09T07:08:04.000Z | MicrosoftSearch/plan-your-content.md | isabella232/OfficeDocs-MicrosoftSearch-pr.de-DE | fa6990a974520a8a6f570028846cddb52b857fb5 | [
"CC-BY-4.0",
"MIT"
] | 2 | 2019-10-12T18:39:28.000Z | 2021-10-09T11:16:03.000Z | ---
title: Plan your content
ms.author: jeffkizn
author: jeffkizn
manager: parulm
ms.audience: Admin
ms.topic: reference
ms.service: mssearch
ms.localizationpriority: medium
search.appverid:
- BFB160
- MET150
- MOE150
ms.assetid: bb9d90b6-6c86-4b19-9235-3bd9b19826ab
description: Deliver high-quality content without additional resources when using Microsoft Search
ms.openlocfilehash: 4382bac3ab8833cc8967e9ebb7ea607bdfcef7f1
ms.sourcegitcommit: ca5ee826ba4f4bb9b9baabc9ae8a130011c2a3d0
ms.translationtype: MT
ms.contentlocale: de-DE
ms.lasthandoff: 09/15/2021
ms.locfileid: "59376067"
---
# <a name="plan-your-content"></a>Planen der Inhalte
**Microsoft Search** hilft Benutzern, relevante Inhalte zu finden. **Microsoft Search** ist eine sichere Möglichkeit, sowohl Ihre Intranet- als auch Ihre Webinhalte zu durchsuchen. Diese web- und organisationsübergreifende Art der Integration steht nur bei Microsoft zur Verfügung.
Suchadministratoren nutzen ihr Wissen über die Organisation und ihre Benutzer, um die relevanten Inhalte für Benutzer leicht auffindbar zu machen.
## <a name="step-1-identify-information-your-users-need"></a>Schritt 1: Identifizieren der Informationen, die Ihre Benutzer benötigen
Finden Sie heraus, wonach Ihre Benutzer suchen, und machen Sie das leicht auffindbar. Hier sind einige Ideen, wie Sie herausfinden können, welche Informationen Benutzer benötigen:
- Ermitteln Sie die Websites und Seiten mit dem meisten Datenverkehr anhand von Intranetsuchprotokollen.
- Ermitteln Sie Apps, Websites und Tools, die täglich oder wöchentlich verwendet werden.
- Suchen Sie direkte Links für Mitarbeitervergütungen.
- Suchen Sie Richtlinien und Prozesse, die Benutzer kennen müssen.
- Entscheiden Sie, mit „wem“ und „wie“ Benutzer den Support kontaktieren sollten.
- Rufen Sie Informationen ab, die auf einer regelmäßigen Basis benötigt werden – entweder jahreszeitlich bedingt oder auf der Grundlage von Geschäftszyklen; Beispiel: Personen die nach Tools zum Reservieren von arbeitsfreier Zeit oder für vierteljährliche wertmäßige Aktualisierungen suchen.
- Sammeln Sie Richtlinien für regionale oder mobile Benutzer wie z. B. Vorteile, die je nach Standort variieren.
- Ermitteln Sie interne Websites und Informationen zu allgemeinen Suchen im Web; Beispiel: Verkehr, Informationen zum öffentlichen Nahverkehr, lokales Wetter, verfügbare Rabatte bei Firmenpartnern sowie Gesundheits- und Fitnessprogramme.
- Suchen Sie Informationen zu vom Unternehmen geförderten Veranstaltungen, Konferenzen oder Erholungsangeboten.
- Recherchieren Sie allgemeine Probleme bei IT, Personalwesen und Support sowie häufig gestellte Fragen (FAQs) und die Antworten darauf.
## <a name="step-2-leverage-subject-matter-experts-smes-and-users"></a>Schritt 2: Nutzen von Experten für das Fachgebiet (SMEs) und Benutzern
In einer Organisation suchen Benutzer nach einem breiten Spektrum von Themen, das von einfachen Themen wie Büroadressen und Mitarbeitervergütungen bis hin zu komplexen Themen wie neuen Arbeitsprozessen, technischen Informationen und Anleitungen reicht. Das Erstellen oder Auffinden eines solchen breiten Spektrums von Inhalten erfordert Kenntnisse und Kompetenzen in verschiedenen Bereichen, Themen, Technologien usw., und ein Suchadministrator verfügt möglicherweise nicht über die erforderlichen Kompetenzen oder Kenntnisse. Administratoren sollten die Kompetenzen und Kenntnisse anderer Personen in der Organisation nutzen, um die Menge der verfügbaren Inhalte ohne zusätzliche Ressourcen zu skalieren.
### <a name="leverage-smes"></a>Nutzen von SMEs
Nutzen Sie SMEs in der Organisation, einschließlich Experten von der Personalabteilung, dem Support, dem Vertrieb, der Technologie und anderen Schlüsselbereichen. Damit Ihre SMEs Inhalte direkt beisteuern können, fügen Sie sie als Sucheditoren hinzu.
### <a name="involve-your-users"></a>Einbeziehen Ihrer Benutzer
Bitten Sie Benutzer, Ressourcen als Lesezeichen vorzuschlagen. Bitte Sie Benutzer, zusätzlich zum Vorschlagen von Inhalten Fehler wie fehlerhafte oder ungültige Links zu melden.
## <a name="step-3-improve-findability-of-content"></a>Schritt 3: Verbessern der Auffindbarkeit von Inhalten
In **Microsoft Search** erstellt der Suchadministrator Lesezeichen, Fragen und Antworten, Standorte und PowerApps, um die Auffindbarkeit von Inhalten zu verbessern. Jede dieser Suchkomponenten enthält einen Titel, eine URL und eine Gruppe von Schlüsselwörtern, die sie auslösen.
### <a name="titles-and-descriptions"></a>Titel und Beschreibungen
Benutzer verwenden Titel und Beschreibungen, um zu ermitteln, ob das Ergebnis ihre Suchabfrage beantwortet oder ob sie eine andere Suche ausprobieren müssen. Titel und Beschreibungen sollten den grundlegenden Zweck des Ergebnisses widerspiegeln. Ein gutes Beispiel für einen Titel könnte „Vergütungen für Kinderbetreuung“ mit der Beschreibung „Informationen zu Vergütungen als Hilfe bei der Bezahlung von Kinderbetreuungskosten“ sein. So werden die Benutzer, die nach „Kinderbetreuung“ suchen, informiert, dass finanzielle Unterstützungsleistungen verfügbar sind. Außerdem wird ein Link zu weiteren Informationen bereitgestellt.
### <a name="keywords"></a>Schlüsselwörter
Schlüsselwörter sind die Begriffe, die Personen in Ihrer Organisation verwenden, um relevante Inhalte zu finden. Die Zuordnung der entsprechenden Schlüsselwörter zu den Suchergebnissen erleichtert das Auffinden der relevanten Inhalte. **Microsoft Search** schlägt ein Schlüsselwort basierend auf dem Titel und der URL für Ihren Inhalt vor. Beantworten Sie zur Identifizierung zusätzlicher Schlüsselwörter zunächst die folgenden Fragen:
- Mit welchen Suchbegriffen wird nach den Informationen gesucht, die Sie identifiziert haben?
- Nutzen Sie ggf. die vorhandene Taxonomie in Ihrer Organisation sowie zugehörige Varianten, Akronyme und Themen.
- Welche anderen Varianten oder Wörter verwenden Personen, um über diese Informationen zu sprechen?
- Wenden Sie sich an Ihr Supportteam, um diese Schlüsselwörter zu ermitteln.
Wenn Sie beispielsweise ein Ergebnis erstellen, das zu einem Tool zum Einreichen von Urlaubsanträgen verlinkt, sind Schlüsselwörter wie „Urlaub“ und „Urlaubsantrag einreichen“ gute Optionen zur Einbeziehung. Möglicherweise stellen Sie fest, dass Personen in Ihrer Organisation „Ferien“ oder „arbeitsfreie Zeit“ verwenden, um urlaubsbezogene Informationen zu beschreiben oder zu suchen. Wenn Schlüsselwörter wie „Ferien“, „arbeitsfreie Zeit“, „Antrag für Ferien einreichen“ oder „Urlaubsplanung“ hinzugefügt werden, können mehr Benutzer die relevanten Inhalte einfacher finden.
### <a name="reserved-keywords"></a>Reservierte Schlüsselwörter
Ein reserviertes Schlüsselwort ist ein eindeutiger Begriff oder Ausdruck, der ein Ergebnis auslöst. Im Gegensatz zu anderen Schlüsselwörtern kann ein reserviertes Schlüsselwort nur einem Ergebnis zugeordnet werden. Verwenden Sie reservierte Schlüsselwörter nur sparsam, damit **Microsoft Search** auf der Grundlage ihrer Nutzung lernen kann.
Wenn Sie beispielsweise ein Lesezeichen für eine Website zum Einreichen Ihrer Stunden erstellen und „Protokollzeit“ als reserviertes Schlüsselwort hinzufügen, sehen Benutzer in Ihrer Organisation, die nach „Protokollzeit“ suchen, die Website zum Einreichen Ihrer Stunden als einziges Lesezeichen im Feld **Microsoft Search**.
### <a name="using-keyword-to-group-related-content"></a>Verwenden von Schlüsselwörtern zum Gruppieren verwandter Inhalte
Wenn Benutzer bei ihrer Suche Gruppen von verwandten Inhalten finden sollen, suchen sie nach einem Begriff, versuchen dann, das gleiche Schlüsselwort für alle verwandten Inhalte zu verwenden. Beispiel: Wenn Sie Ergebnisse über Prozesse und Tools im Zusammenhang mit Änderungen des Familienstands hinzufügen, könnten Sie ein Schlüsselwort wie „Ehe“ einbeziehen, um Ergebnisse im Hinblick auf das Aktualisieren von Vergütungen, Steuerinformationen sowie Namens- und Aliasänderungen zusammen zu gruppieren.
### <a name="search-settings"></a>Sucheinstellungen
Verwenden Sie die Sucheinstellungen, um Ihre Inhalte anzupassen und bestimmte Gruppen von Benutzern anzusprechen. **Microsoft Search** hat die folgenden Einstellungen, die Ihnen zusätzliche Kontrolle darüber geben, wann ein Suchergebnis angezeigt wird und wer es sieht.
- **Datumsangaben:** Legen Sie ein Start- und ein Enddatum fest, um zu steuern, wann Inhalte verfügbar oder nicht verfügbar sind. Beispielsweise wird zeitkritisches Material im Suchergebnis angezeigt, wenn es relevant ist.
- **Land/Region:** Wählen Sie Länder oder Regionen aus, sodass nur Benutzer an diesen Standorten den Inhalt sehen. Beispielsweise werden länderspezifische Informationen nur in diesen Ländern in den Suchergebnissen angezeigt.
- **Gruppen:** Verwenden Sie die Gruppeneinstellungen, um ein Ergebnis nur Mitgliedern einer ausgewählten Gruppe zur Verfügung zu stellen. Wenn Sie beispielsweise Websites erstellen, die nur Mitarbeiter in der Personalabteilung betreffen, könnten Sie diese Einstellung der entsprechenden Sicherheitsgruppe „Personalwesen“ zuordnen.
- **Gerät und Betriebssystem:** Wählen Sie Gerätetypen oder Betriebssysteme aus, sodass das betreffende Lesezeichen nur Benutzern angezeigt wird, die auf diesen Geräten oder unter diesen Systemen suchen.
- **Gezielte Varianten:** Mit dieser Einstellung können Sie den Inhalt des Lesezeichens je nach dem Gerät und Standort des Benutzers variieren.
## <a name="step-4-test-your-content"></a>Schritt 4: Testen Ihres Inhalts
Nachdem Sie Lesezeichen sowie Fragen und Antworten erstellt haben, ist es wichtig, zu überprüfen, dass:
- das richtige Lesezeichen oder die richtige Frage und Antwort angezeigt werden
- alle Inhalte, die mithilfe von Schlüsselwörtern gruppiert sind, zusammen als geplant angezeigt werden
- keine unerwarteten Ergebnisse im Suchergebnis angezeigt werden
- Überprüfen Sie, ob das Lesezeichen oder die Frage und Antwort genügend Informationen haben.
Benutzer und SMEs, die zur Inhaltserstellung beigetragen haben, können dabei helfen, das Suchergebnis zu testen und zu validieren.
## <a name="step-5-use-insights-to-review-and-update-periodically"></a>Schritt 5: Verwenden von Einblicken zur periodischen Überprüfung und Aktualisierung
Es ist wichtig, dass autoritative Informationen wie Lesezeichen sowie Fragen und Antworten auf dem neuesten Stand sind.
- Beheben oder entfernen Sie fehlerhafte oder ungültige URLs.
- Entfernen Sie Lesezeichen oder Fragen und Antworten, die nicht mehr relevant sind.
- Überprüfen Sie auf Änderungen des Tools, Websitenamens oder Teamnamens.
- Überlegen Sie, ob das Lesezeichen oder die Fragen und Antworten genügend autoritativ sind oder eine klarere Beschreibung benötigen.
**Microsoft Search** provides usage statistics for bookmarks, Q&As, and locations. Usage statistics show how your users interact with search results, whether users find what they're looking for, and whether there are gaps in the available content. They help the administrator monitor performance and take appropriate action to fine-tune the search results.
### <a name="get-details-about-bookmarks-qa-and-locations"></a>Get details about bookmarks, Q&As, and locations
View how many bookmarks, Q&As, and locations have been published, scheduled, or suggested. Use the dashboard to see the total number of bookmarks, Q&As, or locations by status:
- **Published:** the number of published results that are available to users
- **Scheduled:** the number of scheduled results in the publishing pipeline
- **Suggested:** the number of suggestions from users
Suggested bookmarks, Q&As, and locations are a good indicator of gaps in your content. They help you understand what your users are searching for and can't find. This could indicate that you need to create more bookmarks, Q&As, or locations, or that you need to update your existing content with better keywords, reserved keywords, and search strings to improve the findability of content.
### <a name="review-top-search-queries"></a>Review top search queries
Identify which searches generated the most impressions during the last 90 days. An impression refers to how often a page was shown in the search results. The **Top queries** card shows the top 25 user searches for each result type, with the total number of searches and their click-through rate (CTR). Use this report to identify search volume and to find queries with high and low search activity.
A low number of searches might indicate user dissatisfaction, either because users aren't searching for that content or because they use other keywords to find it. The CTR shows how often users select the promoted results and how useful your query rules and results are for users. A low CTR indicates that users find the content but decide that it doesn't match their search. In such cases, administrators might decide to review the content, make sure it matches what users search for, and update titles, descriptions, and keywords to align it with users' search queries.
### <a name="analyze-impressions-by-result-type"></a>Analyze impressions by result type
Easy-to-read charts on the **Impression distribution by result type** card show impressions over different time frames. The timeline shows the daily number of impressions for a result type. Identify which result type is used most or least often. Infrequent use of a particular result type doesn't necessarily mean the result type isn't good; it only shows how users use the search results.
Use this report to understand which result types users rely on and what changes in user behavior occur over time. If users prefer a particular result type, administrators might decide to create more search results of the same type, or to review the keywords of result types that users aren't using to make sure those keywords are appropriate.
| 106.608696 | 779 | 0.827964 | deu_Latn | 0.999535 |
110afea0e3f1e8ef34bf79982388ff2926a0ef64 | 419 | md | Markdown | README.md | arif08/opt | 6f3d0ae7df8481d587c068b75c5adaf04ba655bc | [
"Apache-2.0"
] | null | null | null | README.md | arif08/opt | 6f3d0ae7df8481d587c068b75c5adaf04ba655bc | [
"Apache-2.0"
] | null | null | null | README.md | arif08/opt | 6f3d0ae7df8481d587c068b75c5adaf04ba655bc | [
"Apache-2.0"
] | null | null | null | grails-spring-security-otp
==========================
The Spring Security OTP plugin adds One-Time Password ([OATH][oath]'s [Time-based One-Time Password (RFC 6238)][rfc6238] algorithm) authentication to a Grails application that uses [Spring Security][spring-security].
[oath]:http://www.openauthentication.org/
[rfc6238]:http://tools.ietf.org/html/rfc6238
[spring-security]:http://www.springsource.org/spring-security | 52.375 | 214 | 0.739857 | kor_Hang | 0.66638 |
110b087b7cc090488673cd79048b8e94dbbe2db7 | 56 | md | Markdown | README.md | dtroshynski76/chatproject | 125f3f63a02f17756e043dd4d60e927a55fbc0f5 | [
"MIT"
] | null | null | null | README.md | dtroshynski76/chatproject | 125f3f63a02f17756e043dd4d60e927a55fbc0f5 | [
"MIT"
] | null | null | null | README.md | dtroshynski76/chatproject | 125f3f63a02f17756e043dd4d60e927a55fbc0f5 | [
"MIT"
] | null | null | null | # chatproject
A chat application for one of my classes.
| 18.666667 | 41 | 0.785714 | eng_Latn | 0.997484 |
110b11cea9935065812d40995565463330ed49ca | 4,023 | md | Markdown | README.md | nbanmp/gef | ffce545784104c8097f5f25c82f29d4b998b2350 | [
"MIT"
] | 5 | 2021-05-09T12:51:32.000Z | 2021-11-04T11:02:54.000Z | README.md | nbanmp/gef | ffce545784104c8097f5f25c82f29d4b998b2350 | [
"MIT"
] | null | null | null | README.md | nbanmp/gef | ffce545784104c8097f5f25c82f29d4b998b2350 | [
"MIT"
] | 3 | 2021-05-12T12:14:05.000Z | 2021-10-06T05:19:54.000Z | # GDB Enhanced Features (a.k.a. GEF)
<p align="center">
<img src="https://i.imgur.com/v3PUqPx.png" alt="logo"/>
</p>
`GEF` (pronounced ʤɛf - "Jeff") is a set of commands for x86/64, ARM, MIPS, PowerPC and SPARC to assist exploit developers and reverse-engineers when using old school GDB. It provides additional features to GDB using the Python API to assist during the process of dynamic analysis and exploit development. Application developers will also benefit from it, as GEF lifts a great part of regular GDB obscurity, avoiding repeating traditional commands, or bringing out the relevant information from the debugging runtime.

## Instant Setup ##
Simply make sure you have [GDB 7.7 or higher](https://www.gnu.org/s/gdb) compiled with Python3 bindings, then:
```bash
# via the install script
$ wget -q -O- https://github.com/hugsy/gef/raw/master/scripts/gef.sh | sh
# manually
$ wget -O ~/.gdbinit-gef.py -q https://github.com/hugsy/gef/raw/master/gef.py
$ echo source ~/.gdbinit-gef.py >> ~/.gdbinit
```
Then just start playing:
```bash
$ gdb -q /path/to/my/bin
gef➤ gef help
```
_Note_: As of January 2020, GEF doesn't officially support Python 2 any longer, due to Python 2 becoming officially deprecated.
If you really need GDB+Python2, use [`gef-legacy`](https://github.com/hugsy/gef-legacy) instead.
## Highlights ##
A few of `GEF` features include:
* **One** single GDB script.
* Entirely **OS Agnostic**, **NO** dependencies: `GEF` is batteries-included and is installable in 2 seconds (unlike [PwnDBG](https://github.com/pwndbg/pwndbg)).
* **Fast**: limits the number of dependencies and optimizes code to make the
commands as fast as possible (unlike _PwnDBG_).
* Provides a great variety of commands to drastically change your experience in GDB.
* **Easily** extendable to create other commands by providing a more comprehensible
layout of the GDB Python API.
* Works consistently on Python3 (Python2 users should use [`gef-legacy`](https://github.com/hugsy/gef-legacy)).
* Built around an architecture abstraction layer, so all commands work in any
GDB-supported architecture such as x86-32/64, ARMv5/6/7, AARCH64, SPARC, MIPS,
PowerPC, etc. (unlike [PEDA](https://github.com/longld/peda))
* Suited for real-life application debugging and exploit development, just as much as
CTFs (unlike _PEDA_ or _PwnDBG_)
Check out the [Screenshot page](docs/screenshots.md) for more.
Or [try it online](https://demo.gef.blah.cat) (user:`gef`/password:`gef-demo`)
## Documentation ##
Unlike other GDB plugins, GEF has extensive and up-to-date [documentation](https://gef.readthedocs.io/). Users are encouraged to refer to it, as it may help in their use of GEF. In particular, new users should navigate through it (see the [FAQ](https://gef.readthedocs.io/en/master/faq/) for common installation problems), and if a problem persists, try to reach out for help on the IRC channel or submit an issue.
## Current status ##
| Documentation | License | Compatibility | IRC | Test validation |
|--|--|--|--|--|
| [](https://gef.readthedocs.org/en/master/) | [](https://github.com/hugsy/gef/blob/master/LICENSE) | [](https://github.com/hugsy/gef/) | [](https://webchat.freenode.net/?channels=##gef) | [](https://circleci.com/gh/hugsy/gef/tree/master) |
## Contribute ##
To get involved, refer to the [Contribution documentation](https://gef.readthedocs.io/en/master/#contribution) and the [guidelines](https://github.com/hugsy/gef/blob/dev/.github/CONTRIBUTING.md) to start.
And special thanks to [Pedro "TheZakMan" Araujo](https://thezakman.tumblr.com/) for the logo!.
## Happy Hacking ##
| 50.2875 | 616 | 0.732289 | eng_Latn | 0.901739 |