---
-api-id: M:Windows.System.RemoteSystems.RemoteSystem.IsAuthorizationKindEnabled(Windows.System.RemoteSystems.RemoteSystemAuthorizationKind)
-api-type: winrt method
---
<!-- Method syntax.
public bool RemoteSystem.IsAuthorizationKindEnabled(RemoteSystemAuthorizationKind kind)
-->
# Windows.System.RemoteSystems.RemoteSystem.IsAuthorizationKindEnabled
## -description
Checks whether the client device is authorized to discover other users' devices or just same-user devices.
## -parameters
### -param kind
The [RemoteSystemAuthorizationKind](remotesystemauthorizationkind.md) to check.
## -returns
Returns **true** if the client device is set to the *kind* authorization scheme; otherwise, **false**.
## -remarks
This authorization is a system-wide setting. The user can view and change it by going to Settings > System > Shared experiences.
## -see-also
## -examples
## -capabilities
remoteSystem
---
layout: post
title: 离散数学笔记废案
date: 2021-01-29
categories: marginalia
tags: [课堂笔记]
description: 网课网课网课
header-img: "img/headline7.jpg"
catalog: true
typora-root-url: ../../junyahuang.github.io
---
# 谓词逻辑
## 基本推理形式和蕴含关系
### 推理的基本形式
- 所谓推理,是指从已租前提合乎逻辑的推出结论的思维过程。在这里,我们使用命题公式来表达前提和结论
- 设 $G_{1}, G_{2}, \cdots, G_{n}, H$ 是公式,称 $H$ 是 $G_{1}, G_{2}, \cdots, G_{n}$ 的逻辑结果当且仅当对任意解释 $I$,如果 $I$ 使得 $G_{1} \wedge G_{2} \wedge \cdots \wedge G_{n}$ 为真,则 $I$ 也会使 $H$ 为真。记为 $G_{1}, G_{2}, \cdots, G_{n} \Rightarrow H$。“ $\Rightarrow$ ”称为蕴涵关系。此时称 $G_{1}, G_{2}, \cdots, G_{n} \Rightarrow H$ 为有效的,否则称为无效的。$G_{1}, G_{2}, \cdots, G_{n}$ 称为一组前提,有时用集合 $\Gamma$(伽马)来表示,记为 $\Gamma=\left\{G_{1}, G_{2}, \cdots, G_{n}\right\}$,$H$ 称为结论。此时也称 $H$ 是前提集合 $\Gamma$ 的逻辑结果,记为 $\Gamma \Rightarrow H$。
- 有效的推理不一定是正确的推理
### 推理的判定定理
- 公式 $H$ 是前提集合 $\Gamma=\left\{G_{1}, G_{2}, \cdots, G_{n}\right\}$ 的逻辑结果当且仅当 $\left(G_{1} \wedge G_{2} \wedge \cdots \wedge G_{n}\right) \rightarrow H$为永真公式。
- 判定方法
- 真值表技术
- 公式转换法
- 主析取范式法
#### 实例
- 判断推理 $P \rightarrow Q, P \Rightarrow Q$ 是否有效?
- 方法一:真值表技术
| $P$ | $Q$ | $((P \rightarrow Q) \wedge P) \rightarrow Q$ |
| :--: | :--: | :------------------------------------------: |
| 0 | 0 | 1 |
| 0 | 1 | 1 |
| 1 | 0 | 1 |
| 1 | 1 | 1 |
- 方法二:公式转换法
$((P \rightarrow Q) \wedge P) \rightarrow Q$
$=\neg((\neg P \vee Q) \wedge P) \vee Q$
$=\neg(\neg P \vee Q) \vee \neg P \vee Q$
$=\neg(\neg P \vee Q) \vee(\neg P \vee Q)$
$=1$
- 方法三:主析取范式法
$((P \rightarrow Q) \wedge P) \rightarrow Q$
$=\neg((\neg P \vee Q) \wedge P) \vee Q$
$=\neg(\neg P \vee Q) \vee \neg P \vee Q$
$=(P \wedge \neg Q) \vee \neg P \vee Q$
$=(P \wedge \neg Q) \vee(\neg P \wedge(\neg Q \vee Q)) \vee((\neg P \vee P) \wedge Q)$
$=(\neg P \wedge \neg Q) \vee(\neg P \wedge Q) \vee(P \wedge \neg Q) \vee(P \wedge Q)\left(m_{0} \vee m_{1} \vee m_{2} \vee m_{3}\right)$
包括了所有的极小项,故为永真公式
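以上三种判定方法中,真值表技术最为机械,可以用一小段示意性的 Python 程序枚举全部赋值来复核(代码为笔记外的补充演示,`implies` 为自行定义的辅助函数):

```python
from itertools import product

def implies(p, q):
    # 蕴含联结词:p -> q 等价于 (not p) or q
    return (not p) or q

# 验证 ((P -> Q) ∧ P) -> Q 在所有赋值下均为真,即为永真公式
tautology = all(
    implies(implies(p, q) and p, q)
    for p, q in product([False, True], repeat=2)
)
print(tautology)  # True,说明推理 P -> Q, P => Q 有效
```

四行真值表对应 `product([False, True], repeat=2)` 产生的四组赋值,与上表逐行一致。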
### 推理定律——基本蕴含关系
设 $G, H, I$ 为任意的命题公式
以下,左侧为真则右侧为真:
- $I_{1}: G \wedge H \Rightarrow G ; \quad I_{2}: G \wedge H \Rightarrow H$(简化规则)
- $I_{3}: G \Rightarrow G \vee H ; \quad I_{4}: H \Rightarrow G \vee H$(添加规则)
- $I_{5}: G, H \Rightarrow G \wedge H$(合取引入规则)
- $I_{6}: G \vee H, \neg G \Rightarrow H ; \quad I_{7}: G \vee H, \neg H \Rightarrow G$(选言三段论)
- $I_{8}: G \rightarrow H, G \Rightarrow H$(假言推理规则)
- “假如天气晴朗,则我们去旅游”,若“天气确实晴朗”,则推出“我们去旅游”为真
- $I_{9}: G \rightarrow H, \neg H \Rightarrow \neg G$(否定后件事)
- “假如天气晴朗,则我们去旅游”,若“天气不晴朗”,则推出“我们不去旅游”为真
- $I_{10}: G \rightarrow H, H \rightarrow I \Rightarrow G \rightarrow I$(假言三段论)
- 如果有 $G$ 则有 $H$,如果有 $H$ 则有 $I$,推出如果有 $G$,必然有 $I$
- $I_{11}: G \vee H, G \rightarrow I, H \rightarrow I \Rightarrow I$(二难推论)
- $G$ 析取 $H$ 为真,有 $G$ 则有 $I$,有 $H$ 则有 $I$,推出 $I$ 为真
#### 实例
- 如果 a 是偶数,则 a 能被 2 整除:a 是偶数。所以,a 能被 2 整除
- 可描述为 $: P \rightarrow Q, P \Rightarrow Q$(假言推理规则)
- 如果一个人是单身汉,则他不幸福:如果一个人不幸福,则他死得旱。所以,单身汉死得早
- 可描述为 $: P \rightarrow Q, Q \rightarrow R \Rightarrow P \rightarrow R$(假言三段论)
- 若你发电子邮件告诉我密码,则我将完成程序的编写:我没有完成程序的编写。所以,你没 有发电子邮件告诉我密码
- 可描述为 $: P \rightarrow Q, \neg Q \Rightarrow \neg P \quad$(否定后件式)
- 这个案件的凶手肯定是王某或陈某 ; 经过调查,王某不是凶手。所以,陈某是凶手
- 可描述为 $: P \vee Q, \neg P \Rightarrow Q$(选言三段论)
## 自然演绎法推理
### 推理规则
- 规则 $P$ (称为前提引用规则):在推导的过程中,可随时引入前提集合中的任意一个前提
- 规则 $T$ (称为逻辑结果引用规则):在推导的过程中,可以随时引入公式 $S,$ 该公式 $S$ 是由其 前的一个或多个公式推导出来的逻辑结果
- 规则 $CP$(称为附加前提规则):如果能从给定的前提集合 $\Gamma$ 与公式 $P$ 推导出 $S$,则能从此前提集合 $\Gamma$ 推导出 $P \rightarrow S$。
- 原理 $: P \rightarrow(Q \rightarrow R)=(P \wedge Q) \rightarrow R$
- 使用场合:当结论公式是**蕴含式或析取式**时使用
- 三个推理规则加上全部的基本等价公式和基本蕴含公式,可作为推理与演绎的基础,从而构造一个完整的命题验算推理系统。即所有命题逻辑的定理都可以用这些规则严格地证明出来
### 演绎法推理
- 从前提集合 $\Gamma$ 推出结论 $H$ 的一个演绎是构造命题公式的一个有限序列:
- $H_{1}, H_{2}, H_{3}, \cdots, H_{n-1}, H_{n}$
- 其中,$H_{i}$ 或者是 $\Gamma$ 中的某个前提,或者是前面的某些 $H_{j}(j<i)$ 的**有效结论**,并且 $H_{n}$ 就是 $H$,则称公式 $H$ 为该演绎的有效结论,或者称从前提 $\Gamma$ 能够演绎出结论 $H$ 来。
#### 演绎:直接证明法
设前提集合 $\Gamma=\{P \vee Q, Q \rightarrow R, P \rightarrow S, \neg S\},$ 结论 $H=R \wedge(P \vee Q)$
证明 $\Gamma \Rightarrow H$
| 结论 | 推理条件和方法 |
| ------------------------------- | -------------- |
| $(1)\quad$ $P \rightarrow S$ | $P$ |
| $(2)\quad$ $\neg S$ | $P$ |
| $(3)\quad$ $\neg P$ | $T,(1),(2),I$ |
| $(4)\quad$ $P \vee Q$ | $P$ |
| $(5)\quad$ $Q$ | $T,(3),(4), I$ |
| $(6)\quad$ $Q \rightarrow R$ | $P$ |
| $(7)\quad$ $R$ | $T,(5),(6), I$ |
| $(8)\quad$ $R \wedge(P \vee Q)$ | $T,(4),(7), I$ |
- $I$ 表示使用基本蕴含关系
- $E$ 表示使用基本等价关系
- 思考步骤:
- 先考虑得到 $R$,已经条件 $Q \rightarrow R$,则需要知道 $Q$ 为真
- 与 $Q$ 有关的条件为 $P \vee Q$,则需要知道 $\neg P$ 为真
- 与 $\neg P$ 有关的条件为 $P \rightarrow S$、$\neg S$
- 则知 $\neg P$ 为真,随后一步步倒推即可
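上面的演绎是语法层面的推导;从语义上也可以暴力验证该推理的有效性——枚举四个命题变元的全部赋值,检查使所有前提为真的解释是否都使结论为真。以下为示意性的 Python 片段(非笔记原文内容):

```python
from itertools import product

def implies(p, q):
    # 蕴含联结词:p -> q 等价于 (not p) or q
    return (not p) or q

valid = True
for P, Q, R, S in product([False, True], repeat=4):
    premises = [P or Q, implies(Q, R), implies(P, S), not S]
    conclusion = R and (P or Q)
    if all(premises) and not conclusion:
        valid = False  # 找到反例:前提全真而结论为假
print(valid)  # True,说明 Γ => R ∧ (P ∨ Q) 有效
```

这种穷举判定只适用于命题逻辑(变元有限、赋值有限),与后文谓词逻辑“不可判定”的情形形成对照。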
#### 演绎:规则 $CP$ 证明法
设前提集合 $\Gamma=\{P \rightarrow(Q \rightarrow S), \neg R \vee P, Q\},$ 结论 $H=R \rightarrow S$
证明 $\Gamma \Rightarrow H$。
| 结论 | 推理条件和方法 |
| ------------------------------------------ | --------------- |
| $(1)\quad R$ | $P$(附加前提) |
| $(2)\quad \neg R \vee P$ | $P$ |
| $(3)\quad P$ | $T,(1),(2), I$ |
| $(4) \quad P \rightarrow(Q \rightarrow S)$ | $P$ |
| $(5)\quad Q \rightarrow S$ | $T,(3),(4), I$ |
| $(6)\quad Q$ | $P$ |
| $(7)\quad S$ | $T,(5),(6), I$ |
| $(8)\quad R \rightarrow S$ | $C P,(1),(7)$ |
- 思考步骤:
- 和 $S$ 相关的条件只有 $P \rightarrow(Q \rightarrow S)$
- 子公式为 $Q \rightarrow S$,要得到 $S$ 需知道 $Q$,而 $Q$ 为给定前提
- 提取 $Q \rightarrow S$ 需要有 $P$,与 $P$ 有关的条件为 $\neg R \vee P$
- 如果 $\neg R \vee P = P$ 成立,则 $\neg R$ 不成立才可以,则需证明 $R$ 成立
- 而 $R$ 为附加条件
#### 演绎:间接证明法(反证法、归谬法)
- 要证明 $: G_{1}, G_{2}, \cdots, G_{n} \Rightarrow H$
根据判定定理 $:\left(G_{1} \wedge G_{2} \wedge \cdots \wedge G_{n}\right) \rightarrow H$ 为永真公式
即 $: G_{1} \wedge G_{2} \wedge \cdots \wedge G_{n} \wedge \neg H$ 是矛盾式
因此 $: G_{1} \wedge G_{2} \wedge \cdots \wedge G_{n} \wedge \neg H \Rightarrow R \wedge \neg R$
- 方法:将**结论的否定**加入前提集合中,证明出矛盾即可
设前提集合 $\Gamma=\{P \vee Q, P \rightarrow R, Q \rightarrow R\},$ 结论 $H=R_{\circ}$ 证明 $\Gamma \Rightarrow H$
| 结论 | 推理条件和方法 |
| ---------------------------- | ---------------- |
| $(1)\quad$ $\neg R$ | $P$ (附加前提) |
| $(2)\quad$ $P \rightarrow R$ | $P$ |
| $(3)\quad$ $\neg P$ | $T,(1),(2), I$ |
| $(4)\quad$ $Q \rightarrow R$ | $P$ |
| $(5)\quad$ $\neg Q$ | $T,(1),(4), I$ |
| $(6)\quad$ $P \vee Q$ | $P$ |
| $(7)\quad$ $P$ | $T,(5),(6), I$ |
| $(8)\quad$ $P \wedge \neg P$ | $T,(3),(7), I$ |
- 反证法在逻辑推理中有时也十分方便
- 当结论的信息非常丰富(长)的时候,并且此时前提条件非常简单
- 此时将结论取反,作附加前提使用时,无形中增加很多条件
- 然而,总可以不使用它而用规则 $CP$ 证明法来代替它。因为,它实际上本身就是规则 $CP$ 的一种变型。。
### 命题演绎实例
#### 实例一
符号化下面的语句,并使用演绎法证明:
若数 $a$ 是实数,则它不是有理数就是无理数。若 $a$ 不能表示成分数,则它不是有理数。 $a$ 是实数且它不能表示成分数。所以,$a$ 是无理数。
解:
设命题:$P:$ $a$ 是实数
$Q:$ $a$ 是有理数
$R:$ $a$ 是无理数
$S:$ $a$ 能表示成分数
则推理符号化成为:
$P \rightarrow(Q \vee R), \neg S \rightarrow \neg Q, P \wedge \neg S \Rightarrow R$
| 结论 | 推理条件和方法 |
| -------------------------------------- | -------------- |
| $(1)\quad$ $P \wedge \neg S$ | $P$ |
| $(2)\quad$ $P$ | $T,(1), I$ |
| $(3)\quad$ $\neg S$ | $T,(1), I$ |
| $(4)\quad$ $P \rightarrow(Q \vee R)$ | $P$ |
| $(5)\quad$ $Q \vee R$ | $T,(2),(4), I$ |
| $(6)\quad$ $\neg S \rightarrow \neg Q$ | $P$ |
| $(7)\quad$ $\neg Q$ | $T,(3),(6), I$ |
| $(8)\quad$ $R$ | $T,(5),(7), I$ |
- 思维步骤:
- 想得到 $R$,只有 $P \rightarrow(Q \vee R)$ 含有 $R$
- 若 $Q \vee R = R$,则需要得到 $\neg Q$
- 从 $\neg S \rightarrow \neg Q$ 入手,则需要得到 $\neg S$ 成立
- $\neg S$ 在 $P \wedge \neg S$ 中,则 $P$ 和 $\neg S$都成立
#### 实例二
符号化下面的语句,并使用演绎法证明:
如果马会飞或羊吃草,则母鸡就会是飞鸟;如果母鸡是飞鸟,那么烤熟的鸭子还会跑;烤熟的鸭子不会跑。所以羊不吃草。
注意:**推理的有效性和真实性是不同的**,推理本身是有效的,但是结论是错误的。因为前提中有假,但是这并不妨碍我们做正确的推理。
解:
设命题:$P:$ 马会飞
$Q:$ 羊吃草
$R:$ 母鸡是飞鸟
$S:$ 烤熟的鸭子会跑
则推理符号化成为:
$(P \vee Q) \rightarrow R, R \rightarrow S, \neg S \Rightarrow \neg Q$
| 结论 | 推理条件和方法 |
| ------------------------------------- | -------------- |
| $(1)\quad$ $\neg S$ | $P$ |
| $(2)\quad$ $R \rightarrow S$ | $P$ |
| $(3)\quad$ $\neg R$ | $T,(1),(2)I$ |
| $(4)\quad$ $(P \vee Q) \rightarrow R$ | $P$ |
| $(5)\quad$ $\neg(P \vee Q)$ | $T,(3),(4), I$ |
| $(6)\quad$ $\neg P \wedge \neg Q$ | $T .(5), E$ |
| $(7)\quad$ $\neg Q$ | $T,(6), I$ |
- 第五步到第六步为德摩根律的等价变换,为 $E$
- 思维步骤
- $\neg Q$ 无法直接得到,所以需要变形,可以从前提中最简单的前提开始着手
- 最简单的前提为 $\neg S$,与之相关的还有 $R \rightarrow S$
- 由否定后件式可以得到 $\neg R$,而$\neg R$ 和 $(P \vee Q) \rightarrow R$ 可以得到 $\neg(P \vee Q)$ 为真
- $\neg(P \vee Q)$ 根据德摩根律等于 $\neg P \wedge \neg Q$
- 所以 $\neg P$ 和 $\neg Q$ 均成立,所以结论 $\neg Q$ 成立
## 谓词的引入
### 命题逻辑的局限性
- 苏格拉底三段论
- 所有的人都是要死的;苏格拉底是人。所以,苏格拉底是要死的。
- 含变量的语句
- $x>3$;$x=y+3$;$x+y=z$ 等等
- 为了研究简单命题句子内部的逻辑关系,我们需要对简单命题进行分解,利用**个体词、谓词和量词**来描述它们,并研究个体与总体的内在联系和数量关系,这就是**谓词逻辑**或**一阶逻辑**
### 个体词和谓词
#### 简单命题分解
- 命题是具有真假意义的陈述句,从语法上分析,一个陈述句由主语和谓语两部分组成
- 在原子命题中,可以独立存在的客体(**句子中的主语、宾语等**),称为个体词,而用以**刻画客体的性质或客体之间的关系**即是**谓词**
- 考虑如下两个命题
- 陈华是东北师范大学的学生
- 张强是东北师范大学的学生
- 设 $P(x): x$ 是东北师范大学的学生
- 则上述两个句子可写为:
- $\mathrm{P}$(陈华)
- $\mathrm{P}$(张强)
- 语句 “x 大于 3" 可用 $Q(x)$ 表示
- $Q(x)$ 无固定真值,一旦给变量 $x$ 赋一个值,则成为命题,具有一个或真或假的真 值。如 $x=5,$ 则 $Q(5)=1$
- 语句 “$x=y+3$” 可用 $R(x, y)$ 表示
- $R(x, y)$ 无固定真值,一旦给变量 $x, y$ 赋一个值,则成为命题,具有一个或真或假的 真值。如 $x=5, y=3,$ 则 $R(5,3)=0$
### 个体词
- 个体词可分为两种, 个体常量和个体变量,均在个体域内取值
- 表示具体或特定的个体词称为个体常量。一般用带或不带下标的小写英文字母$a, b, c, \cdots, a_{1}, b_{1}, c_{1}, \cdots$ 等表示
- 表示抽象的或泛指的个体词称为个体变量。一般用带或不带下标的小写英文字母 $x, y, z, \cdots, x_{1}, y_{1}, z_{1}, \cdots$ 等表示
- 个体词的取值范围称为个体域 (或论域),常用 $D$ 表示
- 宇宙间的所有个体域聚集在一起所构成的个体域称为全总个体域。若无特别说明 均使用全总个体域
### 谓词
- 设 $D$ 为非空的个体域,定义在 $D^{n}$(表示 $n$ 个个体都在个体域 $D$ 上取值)上、取值于 $\{0,1\}$ 的 $n$ 元函数,称为 $n$ 元命题函数或 $n$ 元谓词,记为 $P\left(x_{1}, x_{2}, \cdots, x_{n}\right)$。其中,个体变量 $x_{1}, x_{2}, \cdots, x_{n} \in D$
- 表示具体性质或关系的谓词称为谓词常量
- 表示抽象的或泛指的性质或关系的谓词称为谓词变量
- 谓词均使用大写英文字母 $P, Q, R, \cdots, F, G, H, \cdots$ 来表示
- 小张和小李同岁。可描述为: $F(a, b)$, 其中 $a:$ 小张, $b:$ 小李,这里的 $F$ 是谓词常量
- $x$ 与 $y$ 具有关系 $L$。可描述为 $: L(x, y),$ 这里的 $L$ 是谓词变量
### 复合命题的谓词符号化
- 如果王童是一个三好学生,那么她的学习成绩一定很好
- 设 $S(x): x$ 是一个三好学生
$H(x): x$ 学习成绩好,$a$ : 王童
- 则该命题符号化为 $: S(a) \rightarrow H(a)$
- 李新华是李兰的父亲并且李兰和张三是同班同学
- 设 $F(x, y): x$ 是 $y$ 的父亲
$M(x, y): x$ 与 $y$ 是同班同学
$b:$ 李新华,$c:$ 李兰, $d:$ 张三
- 则该命题符号化为 $: F(b, c) \wedge M(c, d)$
- 北京是中国的首都当且仅当 2 是偶数
- 设 $C(x): x$ 是中国的首都
$E(x): x$ 是偶数
$b:$ 北京,$c:2$
- 则该命题符号化为 $: C(b) \leftrightarrow E(c)$
### 说明和总结
- 谓词中个体词的顺序是十分重要的,不能随意变更。 $F(b, c) \neq F(c, b)$
- 一元谓词用以描述某一个个体的某种**特性**,而 $n$ 元谓词 $(n \geqslant 2)$ 则用以描述 $n$ 个个体之间的**关系**
- 谓词 $P\left(x_{1}, x_{2}, \cdots, x_{n}\right)$ 包含了个体变量,因而本身并不是命题,只有用谓词常量取代 $P$, 用个体常量取代 $x_{1}, x_{2}, \cdots, x_{n}$ 后才会成为命题
- 谓词本身不是命题,下了定义之后才是命题
- 一般将没有任何个体变量的谓词称为 $0$ 元谓词,如$F(a), G(a, b), H\left(a_{1}, a_{2}, \cdots, a_{n}\right)$ 等。当 $F, G, H$ 为谓词常量时, $0$ 元谓词就成为了命题。此时,**命题逻辑中的所有命题都可以表示成 $0$ 元谓词**
## 量词
### 量词的引入
- 虽然目前有了个体词和谓词,但对于有些命题而言,还是无法准确描述
- **所有的**老虎都要吃人
- **每一个**大学生都会说英语
- **有一些**人登上过月球
- **存在**自然数是素数
### 量词的定义
- 全称量词 $(\forall x):$ 所有的 $x ;$ 任意的 $x ;$ 一切的 $x ;$ 每一个 $x; \ldots$
- 存在量词 $(\exists x):$ 有些 $x ;$ 至少有一个 $x ;$ 某一些 $x ;$ 存在 $x ; \cdots$
- 其中的 $x$ 称为作用变量。一般将其量词加在其谓词之前,记为 $(\forall x) F(x), (\exists x) F(x) 。$ 此时,$F(x)$ 称为全称量词和存在量词的辖域
- **所有的**老虎都要吃人
- $\mathrm{P}(x): x$ 要吃人。 $(\forall x) P(x), x \in\{$ 老虎 $\}$
- **每一个**大学生都会说英语
- $Q(x): x$ 会说英语。 $(\forall x) Q(x), x \in\{$ 大学生 $\}$
- **有一些**人登上过月球
- $\mathrm{R}(x): x$ 登上过月球。 $(\exists x) R(x), x \in\{$ 人 $\}$
- **存在**自然数是素数
- $S(x): x$ 是素数。 $(\exists x) S(x), x \in\{$ 自然数 $\}$
### 个体域符号化
- 以上符号化必须要特别注明个体域,在表达比较复杂的命题时会容易混淆。下面引入更准确的表达方式:
- **所有的**老虎都要吃人
- $T(x): x$ 是老虎 $, P(x): x$ 要吃人。 $\quad(\forall x)(T(x) \rightarrow P(x))$
- $x$ 如果是老虎,$x$ 就要吃人
- **每一个**大学生都会说英语
- $C(x): x$ 是大学生 $, Q(x): x$ 会说英语。 $(\forall x)(C(x) \rightarrow Q(x))$
- **有一些**人登上过月球
- $H(x): x$ 是人 $, R(x): x$ 登上过月球。 $(\exists x)(H(x) \wedge R(x))$
- $x$ 是人和 $x$ 登上月球同时成立,至少存在 $x$ 满足两个条件
- **存在**自然数是素数
- $N(x): x$ 是自然数 $, S(x): x$ 是素数。 $(\exists x)(N(x) \wedge S(x))$
- 至少存在一个 $x$ 使得自然数和素数两个条件同时成立
#### 谓词逻辑符号化的两条规则
- 统一个体域为**全总个体域**,而对每一个句子中个体变量的变化范围用一元**特性谓词**刻划之。这种特性谓词在加入到命题函数中时必定遵循如下原则:
- 对于**全称量词** $(\forall x),$ 刻划其对应个体域的特性谓词作为**蕴涵式之前件**加入
- 对于**存在量词** $(\exists x)$, 刻划其对应个体域的特性谓词作为**合取式之合取项**加入
- 上述做法只是通常操作,有时根据用法不同,也可以注明 $x$ 为某特定个体域,如研究微积分时,只研究 $x$ 为实数的情况,这样可以简化问题的推导和处理
### 量词相关的真值确定
“有些同学通过了离散数学考试” 的真值如何确定?
- $(\forall x) G(x):$ 对 $\forall x \in D, G(x)$ 都成立
- $(\forall x) G(x)$ 取值为 1 当且仅当对任意 $x \in D, G(x)$ 都取值为 1
- $(\forall x) G(x)$ 取值为 0 当且仅当存在 $x_{0} \in D,$ 使得 $G\left(x_{0}\right)$ 取值为 0
- $(\exists x) G(x):$ 存在一个 $x_{0} \in D,$ 使得 $G\left(x_{0}\right)$ 成立
- $(\exists x) G(x)$ 取值为 1 当且仅当存在 $x_{0} \in D,$ 使得 $G\left(x_{0}\right)$ 取值为 1
- $(\exists x) G(x)$ 取值为 0 当且仅当对任意 $x \in D, G(x)$ 都取值为 0
#### 实例
设 $\mathrm{P}(x): x$ 是素数 $; \mathrm{I}(x): x$ 是整数 $; \mathrm{Q}(x, y):x+y=0$ 。用语句描述下述句子,并判断其真假值
- $(\forall x)(I(x) \rightarrow P(x))$
- "所有的整数都是素数",真值为假
- $(\exists x)(I(x) \wedge P(x))$
- “存在整数是素数”,真值为真
- $(\forall x)(\forall y)(I(x) \wedge I(y) \rightarrow Q(x, y))$
- "对任意整数 $x,y$,都有$x+y=0$ ",真值为假
- $(\forall x)(I(x) \rightarrow(\exists y)(I(y) \wedge Q(x, y)))$
- "对任意整数 $x$ 都存在整数 $y$,使得 $x+y=0$ ",真值为真
- $(\exists x)(\forall y)(I(x) \wedge(I(y) \rightarrow Q(x, y)))$
- "存在整数 $x$,对任意整数 $y$ 都使得 $x+y=0$ ",真值为假
#### 个体域有限的情况下
- 特别的,当个体域 $D=\left\{x_{0}, x_{1}, \cdots, x_{n}\right\}$ 是有限集合时, $(\forall x) G(x)$ 和 $(\exists x) G(x)$ 的 真值可以用与之等价的命题公式来进行表示。
- $(\forall x) G(x)=G\left(x_{0}\right) \wedge G\left(x_{1}\right) \wedge \cdots \wedge G\left(x_{n}\right)$
- $(\exists x) G(x)=G\left(x_{0}\right) \vee G\left(x_{1}\right) \vee \cdots \vee G\left(x_{n}\right)$
- 设个体域 $D=\{1,2,3,4,5\}, P(x): x$ 是素数,则
- $(\forall x) P(x)=P(1) \wedge P(2) \wedge P(3) \wedge P(4) \wedge P(5)=0 \wedge 1 \wedge 1 \wedge 0 \wedge 1=0$
- $(\exists x) P(x)=P(1) \vee P(2) \vee P(3) \vee P(4) \vee P(5)=0 \vee 1 \vee 1 \vee 0 \vee 1=1$
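个体域有限时,全称量词对应有限合取、存在量词对应有限析取,恰好对应 Python 的 `all` 与 `any`。下面的示意性代码复核上例(`is_prime` 为演示用的简易素数判断,非笔记原文):

```python
def is_prime(n):
    # 简易素数判断,只用于小范围个体域
    return n > 1 and all(n % d != 0 for d in range(2, n))

D = [1, 2, 3, 4, 5]  # 个体域

forall_P = all(is_prime(x) for x in D)   # (∀x)P(x) = P(1) ∧ … ∧ P(5)
exists_P = any(is_prime(x) for x in D)   # (∃x)P(x) = P(1) ∨ … ∨ P(5)

print(forall_P, exists_P)  # False True
```

结果与手工展开一致:$(\forall x) P(x)=0$,$(\exists x) P(x)=1$。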
## 谓词符号化举例
### 谓词逻辑符号化示例一
- **没有人**登上过木星
- 令 $H(x): x$ 是人, $M(x): x$ 登上过木星
- 则命题符号化为 $\neg(\exists x)(H(x) \wedge M(x))$ 或 $(\forall x)(H(x) \rightarrow \neg M(x))$
- 在美国留学的学生**未必**都是亚洲人
- 令 $A(x): x$ 是亚洲人, $H(x): x$ 是在美国留学的学生
- 则命题符号化为 $\neg(\forall x)(H(x) \rightarrow A(x))$ 或 $(\exists x)(H(x) \wedge \neg A(x))$
- 否定“在美国留学的学生都是亚洲人”,或“存在非亚洲人是在美国留学的学生”
- **尽管**有人很聪明,**但**未必一切人都聪明
- 令$M(x): x$ 是人 $; C(x): x$ 很聪明
- 则命题符号化为 $(\exists x)(M(x) \wedge C(x)) \wedge \neg(\forall x)(M(x) \rightarrow C(x))$
- "尽管...但" 是一种合取关系
### 谓词逻辑符号化示例二
- 天下乌鸦一般黑
- 令 $F(x): x$ 是乌鸦 $; G(x, y): x$ 与 $y$ 一般黑
- 则命题符号化为 $(\forall x)(\forall y)(F(x) \wedge F(y) \rightarrow G(x, y))$ 或$\neg(\exists x)(\exists y)(F(x) \wedge F(y) \wedge \neg G(x, y))$
- 任意两只乌鸦都是一样黑的,或不存在两只不一样黑的乌鸦
- 每个实数都存在比它大的另外的实数
- 令 $R(x): x$ 是实数 $; L(x, y):x$ 小于 $y$
- 则命题符号化为 $(\forall x)\left(R(x) \rightarrow(\exists y)(R(y) \wedge L(x, y))\right)$
- 若假定个体域为所有实数,则命题符号化为 $(\forall x)(\exists y) L(x, y)$
- 两次对变元的约束往往与量词的次序有关。不同的量词次序,可以产生不同的真值。因此,当多个量词同时出现时,不能随意颠倒它们的顺序,否则会改变原有的含义。
### 谓词逻辑符号化示例三
符号化下面一组语句:
所有狮子都是凶猛的;有些狮子不喝咖啡;有些凶猛的动物不喝咖啡。
- 令 $\mathrm{P}(\mathrm{x}): \mathrm{x}$ 是狮子 $; Q(\mathrm{x}): \mathrm{x}$ 是凶猛的 $: \mathrm{R}(\mathrm{x}): \mathrm{x}$ 喝咖啡
假定所有动物的集合为个体域 , 则命题符号化为
- $(\forall x)(P(x) \rightarrow Q(x))$
- $(\exists x)(P(x) \wedge \neg R(x))$
- $(\exists x)(Q(x) \wedge \neg R(x))$
所有的个体都是动物,所以令动物的集合为个体域能简化问题
### 谓词逻辑符号化示例四
符号化下面一组句子
所有的蜂鸟都五彩斑斓;没有大鸟以蜜为生;不以蜜为生的鸟都色彩单调;蜂鸟都是小鸟。
- 令$P(x): x$ 是蜂鸟 $; Q(x): x$ 是大鸟 $; R(x): x$ 是以蜜为生的鸟 $; S(x): x$ 五彩斑斓
假定所有鸟的集合为个体域,则命题符号化为
- $(\forall x)(P(x) \rightarrow S(x))$
- $\neg(\exists x)(Q(x) \wedge R(x))$
- $(\forall x)(\neg R(x) \rightarrow \neg S(x))$
- $(\forall x)(P(x) \rightarrow \neg Q(x))$
## 谓词合成公式
### 四类符号
在基于谓词的形式化中,我们将使用如下四种符号:
- 常量符号:指所属个体域 D 中的某个元素,用带或不带下标的小写英文字母 $a, b, c, \cdots, a_{1}, b_{1}, c_{1}, \cdots$ 来表示。
- 变量符号:指所属个体域D 中的任意元素,用带或不带下标的小写英文字母 $x, y, z, \cdots, x_{1}, y_{1}, z_{1}, \cdots$ 来表示。
- 函数符号: $\mathrm{n}$ 元函数符号 $f\left(x_{1}, x_{2}, \cdots, x_{n}\right)$ 可以是所属个体域集合 $D^{n} \rightarrow D$ 的任意一个函数,用带或不带下标的小写英文字母 $f, g, h, \cdots, f_{1}, g_{1}, h_{1}, \cdots$ 来表示。
- 值域是个体域
- 谓词符号:$n$ 元谓词符号 $P\left(x_{1}, x_{2}, \cdots, x_{n}\right)$ 可以是所属个体域集合 $D^{n} \rightarrow\{0,1\}$ 的任意一个谓词,用带或不带下标的大写英文字母 $P, Q, R, \cdots, P_{1}, Q_{1}, R_{1}, \cdots$ 来表示
- 值域是$\{0,1\}$
#### 实例
命题 “周红的父亲是教授”:
- 若令 $f(x): x$ 的父亲 $; P(x): x$ 是教授 $; c:$ 周红,则该命题符号化为 $P(f(c))$
- 若令 $P(x): x$ 是教授 $; F(x, y): x$ 是 $y$ 的父亲 $; c:$ 周红,则该命题符号化为 $(\forall x)(F(x, c) \rightarrow P(x))$
从上面的例子可以看出,函数可用于表达个体词之间的转换关系,给谓词逻辑中的个体词带来了很大的方便
### 项
- 谓词逻辑中的项 ( Term ),被递归地定义为:
- 任意的常量符号或任意的变量符号是项
- 若 $f\left(x_{1}, x_{2}, \cdots, x_{n}\right)$ 是 $\mathrm{n}$ 元函数符号, $t_{1}, t_{2}, \cdots, t_{n}$ 是项,则 $f\left(t_{1}, t_{2}, \cdots, t_{n}\right)$ 是项
- 仅由有限次使用以上两个规则产生的符号串才是项
- 命题 “周红的父亲是教授" 可表示为 $P(f(c)),$ 这里的 $f(c)$ 是项
- $f(g(x, y), h(a, g(x, y), z))$ 是项
### 合成公式
- 若 $P\left(x_{1}, x_{2}, \cdots, x_{n}\right)$ 是 $\mathrm{n}$ 元谓词, $t_{1}, t_{2}, \cdots, t_{n}$ 是项,则称 $P\left(t_{1}, t_{2}, \cdots, t_{n}\right)$ 为原子谓语公式,简称原子公式。
- 满足下列条件的表达式, 称为合式公式(well-formed formulae/wff),简称公式
- 原子公式是合式公式
- 若 $G, H$ 是合式公式, 则 $(\neg G),(\neg H),(G \vee H),(G \wedge H),(G \rightarrow H),(G \leftrightarrow H)$ 也是合式公式
- 若 $G$ 是合式公式,$x$ 是个体变量,则 $(\forall x) G$、$(\exists x) G$ 也是合式公式
- 由有限次使用以上三个规则产生的表达式才是合式公式
- 公式的最外层括号可省略
- 量词后面的括号省略方式为:一个量词的辖域中仅出现一个原子公式,则此辖域的 外层括号可省略,否则不能省略
- 一个个体词只能受一个量词的约束,否则就是没有意义的
#### 实例
- $(\forall x)(\exists y)(P(x, y) \rightarrow(Q(x, y) \vee R(x, a, f(z))))$,$(\forall x)(P(x) \rightarrow R(x))$ 等都是公式
- $(\forall x)(P(x) \rightarrow R(x)$,$(\exists y)(\forall x)(\vee P(x, y))$ 等则不是公式
## 自由变元与约束
### 定义
- 给定一个合式公式 $G$, 若变元 $x$ 出现**在使用变元的量词的辖域之内**,则称变元 $x$ 的出现为**约束出现**,此时的变元 $x$ 称为**约束变元**。若 $x$ 的出现**不是约束出现**,则称它为**自由出现**,此时的变元 $x$ 称为自由变元
- 量词辖域的确定
- 若量词后有括号,则括号内的子公式就是该量词的辖域$(\forall x)(\cdots)$
- 若量词后无括号,则与量词邻接的子公式为该量词的辖域。 $(\forall x) F(x)$
### 判定
确定以下公式中各量词的辖域,并指出各个体变量为自由变元还是约束变元
- $(\forall x)(P(x) \rightarrow(\exists y) R(x, y))$
- $P(x)$ 中的 $x,R(x, y)$ 的 $x, y$ 都为约束变元
- $(\exists x) P(x) \wedge Q(x, y)$
- $P(x)$ 中的 $x$ 为约束变元 $, Q(x, y)$ 中的 $x, y$ 是自由变元
- $(\forall x)(\exists y)(P(y, z) \vee Q(x, y)) \wedge(\exists x) R(x, y)$
- $P(y, z),Q(x, y)$ 中的 $x, y$ 都为约束变元, $z$ 为自由变元 $; R(x, y)$ 中的 $x$ 为约束变元,$y$ 为自由变元。
- $(\forall x)(P(x) \rightarrow R(x)) \wedge(\exists y) Q(x, y)$
- $P(x), R(x)$ 中的 $x$ 为约束变元 $,Q(x, y)$ 中的 $x$ 为自由变元、$y$ 为约束变元
### 两个规则
- 在上面的公式 $(\forall x)(P(x) \rightarrow R(x)) \wedge(\exists y) Q(x, y)$ 中, $P(x), R(x)$ 中的 $x$ 和 $Q(x, y)$ 中的 $x$ 不同,一个是约束变元,一个是自由变元,二者完全不同
- 为了更明确的区分,我们可以 不同的变量符号来表示,可将公式改为
- $(\forall z)(P(z) \rightarrow R(z)) \wedge(\exists y) Q(x, y)$ 或 $(\forall x)(P(x) \rightarrow R(x)) \wedge(\exists y) Q(z, y)$
- 规则1:约束变元的改名规则
- 将量词中的变元以及该量词辖域中此变量之所有约束出现都用新的个体变元替换
- 新的变元一定要有别于改名辖域中的所有其它变量
- 规则2:自由变元的代入规则
- 将公式中出现该自由变元的每一处都用新的个体变元替换
- 新的变元不允许在原公式中以任何约束形式出现。也可用个体常量代入
- 用个体常量代入后,公式的含义发生了变化,使公式具有普遍意义的变为仅对该个体常量有意义
将公式$(\forall x)(P(x) \rightarrow Q(x, y)) \wedge R(x, y)$ 中的约束变元 $x$ 进行改名
- $(\forall z)(P(z) \rightarrow Q(z, y)) \wedge R(x, y)$
将公式 $(\forall x)(P(x) \rightarrow Q(x, y)) \wedge R(x, y)$ 中的自由变元 $y$ 进行代入
- $(\forall x)(P(x) \rightarrow Q(x, z)) \wedge R(x, z)$
### 闭式
- 设 $G$ 是任意一个公式,若 $G$ 中无自由出现的个体变元,则称 $G$ 为封闭的合式公式,简 称闭式。
- $(\forall x)(P(x) \rightarrow(\exists y) R(x, y))$ 是闭式
- $(\exists x) P(x) \wedge Q(x, y)$ 不是闭式
- 显然,闭式是一个命题
- $x=0$不是一个闭式,所以不是一个命题
## 公式的解释与分类
### 公式的解释
谓词逻辑中公式 $G$ 的每一个解释 $I$ 由如下四部分组成:
- 非空的个体域集合 $D$
- $G$ 中的每个常量符号,指定 $D$ 中的某个特定的元素
- $G$ 中的每个 $n$ 元函数符号,指定 $D^{n}$ 到 $D$ 中的某个特定的函数
- $G$ 中的每个 $n$ 元谓词符号,指定 $D^{n}$ 到 \{0,1\} 中的某个特定的谓词
- 规定:公式中无自由变元,或将自由变元看成是常量符号
#### 实例
设有解释 $I$ 为:
- 个体域为 $D=\{\alpha, \beta\}$
- $a$ 指定为: $\alpha$
- $f(\alpha)=\beta, f(\beta)=\alpha$
- $P(\alpha)=1, P(\beta)=0, Q(\alpha, \alpha)=0, Q(\alpha, \beta)=1, Q(\beta, \alpha)=1, Q(\beta, \beta)=1$
判断公式 $(\exists x)(P(f(x)) \wedge Q(x, f(a)))$ 在解释 I 下的真值结果
解:
当 $x=\alpha$ 时 ,
$P(f(x)) \wedge Q(x, f(a))=P(f(\alpha)) \wedge Q(\alpha, f(\alpha))=P(\beta) \wedge Q(\alpha, \beta)=0 \wedge 1=0$
当 $x=\beta$ 时,
$P(f(x)) \wedge Q(x, f(a))=P(f(\beta)) \wedge Q(\beta, f(\alpha))=P(\alpha) \wedge Q(\beta, \beta)=1 \wedge 1=1$
可见原公式的真值结果为真
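这种按解释求值的过程也可以机械化:把个体域 $D$、函数 $f$ 与谓词 $P$、$Q$ 编成查表,再对个体域枚举存在量词。以下为示意性代码(用字符串 `"a"`、`"b"` 代替 $\alpha$、$\beta$,属演示性约定):

```python
D = ["a", "b"]                      # 个体域 {α, β}
f = {"a": "b", "b": "a"}            # f(α)=β, f(β)=α
P = {"a": True, "b": False}         # P(α)=1, P(β)=0
Q = {("a", "a"): False, ("a", "b"): True,
     ("b", "a"): True,  ("b", "b"): True}
a = "a"                             # 常量 a 指定为 α

# 求 (∃x)(P(f(x)) ∧ Q(x, f(a))) 在解释 I 下的真值
value = any(P[f[x]] and Q[(x, f[a])] for x in D)
print(value)  # True:x=β 时子公式为真
```

与上面的手工计算一致:$x=\alpha$ 时子公式为假,$x=\beta$ 时为真,故整个存在式为真。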
### 公式的分类
- 如果公式 $G$ 在它所有的解释下都取值为真,则称 $G$ 为有效公式
- 这里的有效称为“逻辑上的有效”
- $(\forall x)(\forall y)(P(x, y) \wedge Q(x, y) \rightarrow P(x, y))$
- $(\forall x)(\forall y)(\neg P(x, y) \vee P(x, y))$
- 如果公式 $G$ 在它所有的解释下都取值为假,则称 $G$ 为矛盾公式
- $(\forall x)(\forall y)(\neg P(x, y) \wedge P(x, y))$
- 如果至少有一种解释使得公式 $G$ 取值为真,则称 $G$ 为可满足公式
### 公式的判定问题
- 谓词逻辑是不可判定的
- 谓词公式通常无法给出全部的解释
- 只含有一元谓词变项的公式是可判定的
- 如下形式的公式
- $\left(\forall x_{1}\right)\left(\forall x_{2}\right) \cdots\left(\forall x_{n}\right) P\left(x_{1}, x_{2}, \cdots, x_{n}\right)$
- $\left(\exists x_{1}\right)\left(\exists x_{2}\right) \cdots\left(\exists x_{n}\right) P\left(x_{1}, x_{2}, \cdots, x_{n}\right)$
- 若 $P$ 中无量词和其它自由变元时,也是可判定的
- 个体域有穷时的谓词公式是可判定的
## 公式的等价关系
### 等价
- 如果公式 $G \leftrightarrow H$ 是有效公式,则公式 $G, H$ 称为等价的,记为 $G=H$
- 设 $G\left(P_{1}, P_{2}, \cdots, P_{n}\right)$ 是命题演算中的命题公式, $P_{1}, P_{2}, \cdots, P_{n}$ 是出现在 $G$ 中的命题变元,当用 任意的谓词公式 $G_{i}(1 \leqslant i \leqslant n)$ 分别代入 $P_{i}$ 后,得到的新谓词公式 $G\left(G_{1}, G_{2}, \cdots, G_{n}\right)$ 称为原公式 的代入实例
- 永真公式的任意一个代入实例必为有效公式
- 命题演算中的基本等价公式 $E_{1}-E_{24}$ 在谓词演算中仍然成立
### 谓词演算中的基本等价公式
假设 $G(x), H(x)$ 是只含自由变元 $x$ 的公式, $S$ 是不含 $x$ 的公式,则在全总个体域中,有
- 改名规则
$E_{25}:(\exists x) G(x)=(\exists y) G(y)$
$E_{26}:(\forall x) G(x)=(\forall y) G(y)$
- 量词转换率 / 量词否定等价式
$E_{27}: \neg(\exists x) G(x)=(\forall x) \neg G(x)$
$E_{28}: \neg(\forall x) G(x)=(\exists x) \neg G(x)$
- 量词辖域的扩张和收缩率
$E_{29}:(\forall x)\left(G(x) \vee S\right)=(\forall x) G(x) \vee S$
$E_{30}:(\forall x)(G(x) \wedge S)=(\forall x) G(x) \wedge S$
$E_{31}:(\exists x)(G(x) \vee S)=(\exists x) G(x) \vee S$
$E_{32}:(\exists x)(G(x) \wedge S)=(\exists x) G(x) \wedge S$
- 量词分配率
$E_{33}:(\forall x)(G(x) \wedge H(x))=(\forall x) G(x) \wedge(\forall x) H(x)$
- "全称量词" 只对 "合取" 满足分配率
$E_{34}:(\exists x)(G(x) \vee H(x))=(\exists x) G(x) \vee(\exists x) H(x) $
- "存在量词" 只对 "析取" 满足分配率
- 改名分配率
$E_{35}:(\forall x) G(x) \vee(\forall x) H(x)=(\forall x)(\forall y)(G(x) \vee H(y))$
- 全称量词 $+$ 析取
$E_{36}:(\exists x) G(x) \wedge(\exists x) H(x)=(\exists x)(\exists y)(G(x) \wedge H(y))$
- 存在量词 $+$ 合取
- 对于多个量词的公式,设 $G(x, y)$ 是含有自由变元 $x, y$ 的谓词公式,则有
$E_{37}:(\forall x)(\forall y) G(x, y)=(\forall y)(\forall x) G(x, y)$
$E_{38}:(\exists x)(\exists y) G(x, y)=(\exists y)(\exists x) G(x, y)$
- 都是全称量词或存在量词,可以交换顺序
#### 实例
设 $P(x): x$ 今天来上课, 个体域为某班全体同学的集合。则
$\neg(\forall x) P(x) :$ 不是所有的同学今天来上课了
$(\exists x) \neg P(x):$ 今天有的同学没来上课
同样,$\neg(\exists x) P(x)$ 与 $(\forall x) \neg P(x)$ 意义也相同
2、设 $G(x): x$ 勤奋学习, $H(x): x$ 喜欢体育活动, 个体域是大学里的学生
$(\forall x)(G(x) \wedge H(x)):$ 大学里所有学生既勤奋学习又喜欢体育活动
$(\forall x) G(x) \wedge(\forall x) H(x):$ 大学所有学生都勤奋学习且大学所有的学生都喜欢体育活动
3、利用谓词之间的等价关系证明 $: \neg(\exists x)(M(x) \wedge F(x))=(\forall x)(M(x) \rightarrow \neg F(x))$
$\neg(\exists x)(M(x) \wedge F(x))$
$=(\forall x) \neg(M(x) \wedge F(x))$
$=(\forall x)(\neg M(x) \vee \neg F(x))$
$=(\forall x)(M(x) \rightarrow \neg F(x))$
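在有限个体域上,这一等价式可以穷举谓词 $M$、$F$ 的所有取值组合来复核(示意性代码;有限域上的验证不构成一般证明,仅用于排错):

```python
from itertools import product

def implies(p, q):
    # 蕴含联结词:p -> q 等价于 (not p) or q
    return (not p) or q

D = range(3)  # 任取一个小的有限个体域
ok = True
for values in product([False, True], repeat=2 * len(D)):
    M = dict(zip(D, values[:len(D)]))    # 谓词 M 在 D 上的一种取值
    F = dict(zip(D, values[len(D):]))    # 谓词 F 在 D 上的一种取值
    lhs = not any(M[x] and F[x] for x in D)        # ¬(∃x)(M(x) ∧ F(x))
    rhs = all(implies(M[x], not F[x]) for x in D)  # (∀x)(M(x) → ¬F(x))
    if lhs != rhs:
        ok = False  # 找到使两式取值不同的解释
print(ok)  # True:两式在该域的所有解释下同值
```

枚举覆盖了 $2^{2|D|}$ 种解释,对应上面等价变换中量词转换律与德摩根律的语义含义。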
## 前束范式
### 定义
- 在命题逻辑里,每一公式都有与之等值的范式,范式是一种统一的表达形式,当研究一个公式的特点(如永真、永假)时,范式起着重要作用。对谓词逻辑的公式来说,也有范式,其中前束范式与原公式是等值的,而其他范式与原公式只有较弱的关系
- 称公式 $G$ 是一个前束范式,如果 $G$ 中的一切量词都位于该公式的最前端(不含否定词)且这些量词的辖域都延伸到公式的末端。其标准形式如下:
- $\left(Q_{1} x_{1}\right)\left(Q_{2} x_{2}\right) \cdots\left(Q_{n} x_{n}\right) M\left(x_{1}, x_{2}, \cdots, x_{n}\right)$
- 其中 $Q_{i}$ 为量词 $\forall$ 或 $\exists$($i=1, \cdots, n$),$M$ 称作公式 $G$ 的**母式(基式)**,$M$ 中不再有量词
- 所有的量词都在公式 $M$ 的前面,且不能有否定词
- 前束范式的意思是“在前面进行约束”
### 前束范式的求解步骤
- 消去公式中的联结词“ $\rightarrow$ "," $\leftrightarrow$ " (如果有的话)
- 反复运用量词转换律,德摩根律和双重否定律,直到将所有的“ $\neg$ "都**内移**到原子谓词公式的前端
- $\neg(\exists x) G(x)=(\forall x) \neg G(x) ; \quad \neg(\forall x) G(x)=(\exists x) \neg G(x)$
- 量词转换率
- 在 $G(x)$ 处的 " $\neg$ " 若是合取或析取的形式,使用德摩根律处理
- 使用谓词的等价公式将所有量词提到公式的**最前端**并保证其辖域直到公式的末端
- $(\exists x) G(x)=(\exists y) G(y) ; \quad(\forall x) G(x)=(\forall y) G(y)$
- 改名规则
- $(\forall x)(G(x) \wedge H(x))=(\forall x) G(x) \wedge(\forall x) H(x)\\(\exists x)(G(x) \vee H(x))=(\exists x) G(x) \vee(\exists x) H(x)$
- 量词分配率
- $(\forall x) G(x) \vee(\forall x) H(x)=(\forall x)(\forall y)(G(x) \vee H(y))\\(\exists x) G(x) \wedge(\exists x) H(x)=(\exists x)(\exists y)(G(x) \wedge H(y))$
- $(\forall x)(G(x) \vee S)=(\forall x) G(x) \vee S ; \quad(\forall x)(G(x) \wedge S)=(\forall x) G(x) \wedge S ;$
$(\exists x)(G(x) \vee S)=(\exists x) G(x) \vee S ; \quad(\exists x)(G(x) \wedge S)=(\exists x) G(x) \wedge S .$
- 量词辖域的扩张与收缩率
#### 实例
求 $\neg((\forall x)(\exists y) P(a, x, y) \rightarrow(\exists x)(\neg(\forall y) Q(y, b) \rightarrow R(x)))$ 的前束范式
1. 消去联结词 $" \rightarrow ", " \leftrightarrow ",$ 得
$\neg(\neg(\forall x)(\exists y) P(a, x, y) \vee(\exists x)(\neg \neg(\forall y) Q(y, b) \vee R(x)))$
2. " $\neg$ " 消除和内移,得
$(\forall x)(\exists y) P(a, x, y) \wedge \neg(\exists x)((\forall y) Q(y, b) \vee R(x))$
$=(\forall x)(\exists y) P(a, x, y) \wedge(\forall x)((\exists y) \neg Q(y, b) \wedge \neg R(x))$
3. 量词左移,得
$(\forall x)((\exists y) P(a, x, y) \wedge(\exists y) \neg Q(y, b) \wedge \neg R(x))$
$=(\forall x)((\exists y) P(a, x, y) \wedge(\exists z) \neg Q(z, b) \wedge \neg R(x))$
$=(\forall x)(\exists y)(\exists z)(P(a, x, y) \wedge \neg Q(z, b) \wedge \neg R(x))$
$=(\forall x)(\exists y)(\exists z) S(a, b, x, y, z)$
即 $:(\forall x)(\exists y)(\exists z) S(a, b, x, y, z)$ 为原公式的前束范式,这里 $S(a, b, x, y, z)=P(a, x, y) \wedge \neg Q(z, b) \wedge \neg R(x)$ 是母式
## 推理形式和推理规则
### 推理形式
- 设 $G_{1}, G_{2}, \cdots, G_{n}, H$ 是公式,称 $H$ 是 $G_{1}, G_{2}, \cdots, G_{n}$ 的逻辑结果(或称 $G_{1}, G_{2}, \cdots, G_{n}$ 共同蕴涵 $H)$ 当且仅当对任意解释 $I,$ 若 $I$ 同时满足 $G_{1}, G_{2}, \cdots, G_{n},$ 则 $I$ 满足 $H$
- 记为 $G_{1}, G_{2}, \cdots, G_{n} \Rightarrow H,$ 此时称 $G_{1}, G_{2}, \cdots, G_{n} \Rightarrow H$ 是有效的,否则称为无效的
- $G_{1}, G_{2}, \cdots, G_{n}$称为一组前提 (premise),有时用集合 $\Gamma$ 来表示
- 记 $\Gamma=\left\{G_{1}, G_{2}, \cdots, G_{n}\right\}$,$H$ 称为结论 (conclusion),又称 $H$ 是前提集合 $\Gamma$ 的逻辑结果,记为 $\Gamma \Rightarrow H$。
- 设 $G_{1}, G_{2}, \cdots, G_{n}, H$ 是公式,公式 $H$ 是前提集合 $\Gamma=\left\{G_{1}, G_{2}, \cdots, G_{n}\right\}$ 的逻辑结果当且仅 当 $G_{1} \wedge G_{2} \wedge \cdots \wedge G_{n} \rightarrow H$ 为有效公式
- 根据代入实例的特性,命题演算中的基本蕴涵公式 $I_{1}-l_{11}$ 在谓词演算中仍然成立
### 推理规律
假设 $G(x), H(x)$ 是只含自由变元 $x$ 的公式,则在全总个体域中,有
- $I_{12}:(\forall x) G(x) \Rightarrow(\exists x) G(x)$
- $I_{13}:(\forall x) G(x) \vee(\forall x) H(x) \Rightarrow(\forall x)(G(x) \vee H(x))$
$I_{14}:(\exists x)(G(x) \wedge H(x)) \Rightarrow(\exists x) G(x) \wedge(\exists x) H(x)$
- 注意 "全称量词" 对 "析取“,"存在量词" 对 "合取"
- $I_{15}:(\forall x)(G(x) \rightarrow H(x)) \Rightarrow(\forall x) G(x) \rightarrow(\forall x) H(x)$
$I_{16}:(\forall x)(G(x) \rightarrow H(x)) \Rightarrow(\exists x) G(x) \rightarrow(\exists x) H(x)$
对于多个量词的公式,设 $G(x, y)$ 是含有自由变元 $x, y$ 的谓词公式,则有
- $I_{17}:(\exists x)(\forall y) G(x, y) \Rightarrow(\forall y)(\exists x) G(x, y)$
$I_{18}:(\forall x)(\forall y) G(x, y) \Rightarrow(\exists y)(\exists x) G(x, y)$
$I_{19}:(\forall y)(\forall x) G(x, y) \Rightarrow(\exists x)(\forall y) G(x, y)$
$I_{20}:(\exists y)(\forall x) G(x, y) \Rightarrow(\forall x)(\exists y) G(x, y)$
$I_{21}:(\forall x)(\exists y) G(x, y) \Rightarrow(\exists y)(\exists x) G(x, y)$
$I_{22}:(\forall y)(\exists x) G(x, y) \Rightarrow(\exists x)(\exists y) G(x, y)$
### 推理规则
#### 全称特指规则
- US ( 全称特指规则)
- $(\forall x) G(x) \Rightarrow G(y)$,$y$ 不在 $G(x)$ 中约束出现
- 或 $:(\forall x) G(x) \Rightarrow G(c)$, $c$ 为**任意**个体常量
设实数集中,语句 “不存在最大的实数" 可符号化为 $:(\forall x)(\exists y) G(x, y) $
其中 $: G(x, y): y>x$
如下推导正确吗?为什么?
|步骤|条件|
|-|-|
| (1) $(\forall x)(\exists y) G(x, y)$ | $P$ |
| (2) $(\exists y) G(y, y)$ | $US,(1)$ |
解: 以上推导不正确,正确的推导应为:
|步骤|条件|
|-|-|
| (1) $(\forall x)(\exists y) G(x, y)$ | $P$ |
| (2) $(\exists y) G(z, y)$ | $US,(1)$ |
#### 存在特指规则
- ES ( 存在特指规则 )
- $(\exists x) G(x) \Rightarrow G(c), c$ 为使得 $G(c)$ 为 ”**真**“ 的**特定**的个体常量
- 当 $G(x)$ 中还有除 $x$ 之外的自由变元,则必须用关于这些变元的函数符号来取代 $c$
设实数集中,语句 “不存在最大的实数" 可符号化为 $:(\forall x)(\exists y) G(x, y) $
其中 $: G(x, y): y>x$
如下推导正确吗?为什么?
|步骤|条件|
|-|-|
| (1) $(\forall x)(\exists y) G(x, y)$ | $P$ |
| (2) $(\exists y) G(z, y)$ | $US,(1)$ |
|(3) $G(z, c)$|$ES,(2)$|
解: 以上推导不正确,正确的推导应为:
|步骤|条件|
|-|-|
| (1) $(\forall x)(\exists y) G(x, y)$ | $P$ |
| (2) $(\exists y) G(z, y)$ | $US,(1)$ |
|(3) $G(z, f(z))$|$ES,(2)$|
- $z$ 改变时, $y$ 应随着 $z$ 而改变,而不是一个常量
#### 全称推广规则
- UG(全称推广规则):$G(y) \Rightarrow(\forall x) G(x)$,$G(y)$ 中无变元 $x$
设实数集中,语句 “不存在最大的实数" 可符号化为 $:(\forall x)(\exists y) G(x, y) $
其中 $: G(x, y): y>x$
如下推导正确吗?为什么?
|步骤|条件|
|-|-|
| (1) $(\forall x)(\exists y) G(x, y)$ | $P$ |
| (2) $(\exists y) G(z, y)$ | $US,(1)$ |
|(3) $(\forall y)(\exists y) G(y, y)$|$UG,(2)$|
解: 以上推导不正确,正确的推导应为:
|步骤|条件|
|-|-|
| (1) $(\forall x)(\exists y) G(x, y)$ | $P$ |
| (2) $(\exists y) G(z, y)$ | $US,(1)$ |
|(3) $(\forall z)(\exists y) G(z, y)$|$UG,(2)$|
- $y$ 已经出现过,不能再次用 $y$ 换元,应用不相关的字母
#### Existential Generalization (EG)

- EG (Existential Generalization)
- $G(c) \Rightarrow (\exists x) G(x)$, where $c$ is a particular individual constant
- or: $G(y) \Rightarrow (\exists x) G(x)$, where $G(y)$ contains no variable $x$

Let $G(x, y): y > x$.
Is the following derivation correct? Why?

| Step | Justification |
| ------------------------ | -------- |
| (1) $G(x,c)$ | $P$ |
| (2) $(\exists x) G(x, x)$ | $EG,(1)$ |

Solution: the derivation above is incorrect. The correct derivation is:

| Step | Justification |
| ------------------------ | -------- |
| (1) $G(x,c)$ | $P$ |
| (2) $(\exists y) G(x, y)$ | $EG,(1)$ |

- The free variable $x$ already occurs in $G(x, c)$, so generalizing $c$ to $x$ would capture it; a fresh variable such as $y$ must be used
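To see the four rules working together, here is an illustrative derivation (not from the original notes) proving $(\forall x) P(x) \wedge (\exists x) Q(x) \Rightarrow (\exists x)(P(x) \wedge Q(x))$. Note that ES is applied before US, so that the particular constant $c$ introduced by ES can then be reused by US:

|Step|Justification|
|-|-|
| (1) $(\exists x) Q(x)$ | $P$ |
| (2) $Q(c)$ | $ES,(1)$ |
| (3) $(\forall x) P(x)$ | $P$ |
| (4) $P(c)$ | $US,(3)$ |
| (5) $P(c) \wedge Q(c)$ | $T,(2),(4)$ |
| (6) $(\exists x)(P(x) \wedge Q(x))$ | $EG,(5)$ |

Applying US before ES would not work, because US would have to pick the same constant that ES later introduces, which is not justified.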
## Combined Inference Methods
# Beelzebub

- **type**: manga
- **volumes**: 28
- **chapters**: 250
- **original-name**: べるぜバブ
- **start-date**: 2009-02-23
- **end-date**: 2009-02-23
## Tags
- action
- comedy
- demons
- school
- shounen
- supernatural
## Authors
- Tamura, Ryuuhei (Story & Art)
## Synopsis
Ishiyama High—land of the delinquents. No matter how tough you think you are, you don't mess with an Ishi-high student. One of the most infamous students is first year Tatsumi Oga, called "Demon King" by those he's defeated.
One day, while Oga is finishing off a gang that attacked him, he stumbles upon an infant. In a rare moment of kindness, Oga tries to care for the baby. As a result, the child becomes overly attached to him. In a panic, he brings the baby to his best friend Takayuki Furuichi's house. While there, they are attacked by Hildegard, a demon maid who says the baby is actually Kaiser de Emperana Beelzebub IV. She also reveals that he was sent here to destroy humanity, on a whim of his father Beelzebub III, Great Demon Lord of the Demon World. After failing to remove Baby Beel from Oga, she declares that he must raise him, as moving more than 15 meters away will result in instant death for Oga.
With his delinquent past and unfriendly nature, Oga must deal with the burden of raising a demon baby. Beelzebub follows his story through encounters with demons and dangerous classmates, all while he serves as an unorthodox role model and carer for the young Baby Beel.
[Source: My Anime List]
## Links
- [My Anime list](https://myanimelist.net/manga/10010/Beelzebub)
# Number Object
- All numbers are 64-bit floating point ("double")
- Number.MAX_VALUE = 1.7976931348623157e+308
  Number.MIN_VALUE = 5e-324
- Number.toString(base) renders the number in base n;
  base is the radix.
  For example:
      var a = 100;
      a.toString(16); // "64"
      a.toString(2);  // "1100100"
---
# Heads up: parseInt and octal

- the 0x prefix means hexadecimal, e.g. 0xFF
- a leading 0 means octal, e.g. 014 (12 in decimal)
---
## Caveats when using parseInt()

- parseInt("08") // 0
- not what you would expect
- parsing proceeds until the first invalid character is reached
- parseInt(08) // Error: 08 is not a legal ECMA-262 octal constant
Always specify the radix as the second argument to be safe:
- parseInt("08", 10); // 8
---
# Math Object
Handy methods and constants on the Math object:
- Math.PI 3.141592653589793
- Math.E 2.718281828459045
- Math.SQRT2 1.4142135623730951
- Math.LN2 0.6931471805599453
- Math.LN10 2.302585092994046
- Math.random() returns a random number in the range [0, 1)
- Math.round(), Math.floor(), Math.ceil()
- Math.sin(), Math.cos(), Math.atan()
- Math.min(), Math.max()
- Math.pow(), Math.sqrt()
---
# NaN = Not a Number
- not equal to anything, not even to itself
- [Use isFinite, not isNaN, to test whether a value is an ordinary number (三等兵)](http://d.hatena.ne.jp/sandai/20100206/p1)
- [A slip-up with _.isNaN (わからん)](http://d.hatena.ne.jp/kitokitoki/20110607/p2)
- isNaN(NaN);// true
- typeof NaN;// "number"
# Infinity
- 1/0 -> Infinity
- Infinity > Number.MAX_VALUE
- 1/Infinity -> 0
- Infinity/Infinity -> NaN (Not a Number)
# PI Monitoring Documentation
## About The Project
PI Monitoring is a set of Python scripts used to monitor websites and report incidents and operational status to statuspage.io.
### Built With
* [Python](https://www.python.org/)
Utility functions for both the editor and the viewer.
# tongue
## Related Ideas:
language
## Definition:
There are several figurative meanings of "tongue" in the Bible.
* In the Bible, the most common figurative meaning for this term is "language" or "speech."
* Sometimes "tongue" may refer to a human language spoken by a certain people group.
* Other times it refers to a supernatural language that the Holy Spirit gives believers in Christ as one of the "gifts of the Spirit."
* The expression "tongues" of fire refers to "flames" of fire.
* In the expression "my tongue rejoices," the term "tongue" refers to the whole person. (See: [[rc://en/ta/man/jit/figs-synecdoche]])
* The phrase "lying tongue" refers to a person's voice or speech. (See: [metonymy](rc://en/ta/man/jit/figs-metonymy))
## Picture of a Tongue:
<a href="https://content.bibletranslationtools.org/WycliffeAssociates/en_tw/raw/branch/master/PNGs/t/Tongue.png"><img src="https://content.bibletranslationtools.org/WycliffeAssociates/en_tw/raw/branch/master/PNGs/t/Tongue.png" ></a>
## Translation Suggestions:
* Depending on the context, the term "tongue" can be translated by "language" or "spiritual language." If it is not clear which one it is referring to, it is better to translate it as "language."
* When referring to fire, this term could be translated as "flames."
* The expression "my tongue rejoices" could be translated as "I rejoice and praise God" or "I am joyfully praising God."
* The phrase "tongue that lies" could be translated as "person who tells lies" or "people who lie."
* Phrases such as "with their tongues" could be translated as "with what they say" or "by their words."
(See also: [gift](../kt/gift.md), [Holy Spirit](../kt/holyspirit.md), [joy](../other/joy.md), [praise](../other/praise.md), [rejoice](../other/joy.md), [spirit](../kt/spirit.md))
## Bible References:
* [1 Corinthians 12:10](rc://en/tn/help/1co/12/10)
* [1 John 03:18](rc://en/tn/help/1jn/03/18)
* [2 Samuel 23:02](rc://en/tn/help/2sa/23/02)
* [Acts 02:26](rc://en/tn/help/act/02/26)
* [Ezekiel 36:03](rc://en/tn/help/ezk/36/03)
* [Philippians 02:11](rc://en/tn/help/php/02/11)
## Word Data:
* Strong's: H3956, G1100, G1258, G2084, G5456
## Forms Found in the English ULB:
language, languages, tongue, tongues
---
title: My Extensions Page, Project Designer (Visual Basic)
description: Learn how to use the My Extensions page of the Project Designer to manage My namespace extensions in your project.
ms.custom: SEO-VS-2020
ms.date: 11/04/2016
ms.topic: reference
f1_keywords:
- vb.ProjectPropertiesMyExtensions
helpviewer_keywords:
- Project Designer, My Extensions page
- My Extensions page in Project Designer
ms.assetid: 2f08494e-84c1-444b-872b-900fbbcf0364
author: TerryGLee
ms.author: tglee
manager: jmartens
ms.technology: vs-ide-general
ms.workload:
- multiple
ms.openlocfilehash: 8fec9e648a1f17cf4023aac0d7bc6034bfa7211c
ms.sourcegitcommit: 68897da7d74c31ae1ebf5d47c7b5ddc9b108265b
ms.translationtype: HT
ms.contentlocale: de-DE
ms.lasthandoff: 08/13/2021
ms.locfileid: "122151248"
---
# <a name="my-extensions-page-project-designer-visual-basic"></a>My Extensions Page, Project Designer (Visual Basic)

Use the **My Extensions** page of the **Project Designer** to manage `My` namespace extensions in your project. `My` namespace extensions let you customize the `My` namespace by adding your own custom members. For information about creating custom `My` namespace extensions, see [Extending the My Namespace in Visual Basic](/dotnet/visual-basic/developing-apps/customizing-extending-my/extending-the-my-namespace).

To access the **My Extensions** page, double-click **My Project** for your project node in **Solution Explorer**. When the **Project Designer** appears, click the **My Extensions** tab.

## <a name="uielement-list"></a>UIElement List

The following options let you add `My` namespace extensions to your project or remove them from it. A `My` namespace extension must first be installed as a Visual Studio item template before it is available to be added. For information about publishing and installing `My` namespace extensions, see [Packaging and Deploying Custom My Extensions](/dotnet/visual-basic/developing-apps/customizing-extending-my/packaging-and-deploying-custom-my-extensions).

**My namespace extensions**

This list shows all the `My` namespace extensions installed in the project.

**Add Extension**

Click this button to add an installed `My` namespace extension to your project. A list of all available `My` namespace extensions is displayed. Select the `My` namespace extension that you want to add to your project and click **OK** to add it.

**Remove Extension**

Select one or more references in the **My namespace extensions** list, and then click this button to remove the `My` namespace extension from the project.

## <a name="see-also"></a>See also

- [Extending the My Namespace in Visual Basic](/dotnet/visual-basic/developing-apps/customizing-extending-my/extending-the-my-namespace)
- [Packaging and Deploying Custom My Extensions](/dotnet/visual-basic/developing-apps/customizing-extending-my/packaging-and-deploying-custom-my-extensions)
- [Extending the Visual Basic Application Model](/dotnet/visual-basic/developing-apps/customizing-extending-my/extending-the-visual-basic-application-model)
- [Customizing Which Objects Are Available in My](/dotnet/visual-basic/developing-apps/customizing-extending-my/customizing-which-objects-are-available-in-my)
# manage-system #
A back-end management system solution based on the Vue.js 2.x series and Element UI. [Live demo](http://blog.gdfengshuo.com/example/work/)
[English document](https://github.com/lin-xin/manage-system/blob/master/README_EN.md)
## Preface ##

I previously built a back-office management system at my company with Vue and the Element component library. Most of the UI could be assembled directly from components in the library, but some requirements could not be met by it alone. Features that are common in admin systems, such as image cropping and upload, a rich-text editor, and charts, required bringing in other components. From finding those components to using them, I ran into many problems and accumulated valuable experience, so I have distilled the experience of building that system into this solution.

This solution is a multi-purpose back-end framework template suitable for the development of the vast majority of web management systems. It is based on Vue.js, uses the vue-cli scaffold to quickly generate the project skeleton, and pulls in the Element UI component library so that clean, good-looking components can be built quickly. Color styles are separated out, manual theme switching is supported, and custom theme colors are easy to apply.
## Features ##

- [x] Element UI
- [x] Login/logout
- [x] Tables
- [x] Forms
- [x] Charts :bar_chart:
- [x] Rich-text editor
- [x] Markdown editor
- [x] Image drag-and-drop / crop and upload
- [x] Switchable theme colors :sparkles:
- [x] Drag-and-drop list sorting
## Directory Structure ##

    |-- build                          // webpack configuration files
    |-- config                         // project build output paths
    |-- src                            // source directory
    |   |-- components                 // components
    |   |   |-- common                 // shared components
    |   |   |   |-- Header.vue         // shared header
    |   |   |   |-- Home.vue           // shared routing entry
    |   |   |   |-- Sidebar.vue        // shared sidebar
    |   |   |-- page                   // main routed pages
    |   |   |   |-- BaseCharts.vue     // basic charts
    |   |   |   |-- BaseForm.vue       // basic form
    |   |   |   |-- BaseTable.vue      // basic table
    |   |   |   |-- Login.vue          // login
    |   |   |   |-- Markdown.vue       // markdown component
    |   |   |   |-- Readme.vue         // readme component
    |   |   |   |-- Upload.vue         // image upload
    |   |   |   |-- VueEditor.vue      // rich-text editor
    |   |   |   |-- VueTable.vue       // vue table component
    |   |-- App.vue                    // page entry file
    |   |-- main.js                    // program entry file; loads the shared components
    |-- .babelrc                       // ES6 syntax compilation config
    |-- .editorconfig                  // code style configuration
    |-- .gitignore                     // ignored files
    |-- index.html                     // entry html file
    |-- package.json                   // dependency configuration for the project and tooling
    |-- README.md                      // this file
## Installation ##

    git clone https://github.com/lin-xin/manage-system.git     // download the template
    cd manage-system                                           // enter the template directory
    npm install                                                // install the project dependencies and wait for them to finish
## Local Development ##

    // start the dev server, then visit http://localhost:8080 in a browser
    npm run dev
## Production Build ##

    // run the build command; serve the generated dist folder from any web server
    npm run build
## Component Usage Notes and Demos ##
### vue-schart ###
A Vue.js chart component that wraps sChart.js. Repository: [vue-schart](https://github.com/linxin/vue-schart)
<p><a href="https://www.npmjs.com/package/vue-schart"><img src="https://img.shields.io/npm/dm/vue-schart.svg" alt="Downloads"></a></p>
```JavaScript
<template>
<div>
<schart :canvasId="canvasId"
:type="type"
:width="width"
:height="height"
:data="data"
:options="options"
></schart>
</div>
</template>
<script>
import Schart from 'vue-schart'; // import the Schart component
export default {
data: function(){
return {
            canvasId: 'myCanvas', // id of the canvas element
            type: 'bar', // chart type
width: 500,
height: 400,
data: [
{name: '2014', value: 1342},
{name: '2015', value: 2123},
{name: '2016', value: 1654},
{name: '2017', value: 1795},
],
            options: { // optional chart parameters
title: 'Total sales of stores in recent years'
}
}
},
components: {
Schart
}
}
</script>
```
### element-ui ###

A desktop component library based on Vue.js 2.0. Website: [element](http://element.eleme.io/#/zh-CN/component/layout)

### vue-datasource ###

A Vue.js server-side component for dynamically creating tables. Repository: [vue-datasource](https://github.com/coderdiaz/vue-datasource)

### Vue-Quill-Editor ###

A rich-text editor for Vue 2 based on Quill. Repository: [vue-quill-editor](https://github.com/surmon-china/vue-quill-editor)

### Vue-SimpleMDE ###

A Markdown editor component for Vue.js. Repository: [Vue-SimpleMDE](https://github.com/F-loat/vue-simplemde)

### Vue-Core-Image-Upload ###

A lightweight Vue upload plugin with cropping support. Repository: [Vue-Core-Image-Upload](https://github.com/Vanthink-UED/vue-core-image-upload)
## Other Notes ##

### 1. What if I don't want to use one of the components above? How do I remove it from the template without affecting other functionality? ###

For example, suppose I don't want to use the vue-datasource component; removing it takes four steps.

Step 1: delete the component's route. In src/router/index.js, find the route that imports the component and delete the following code.
```JavaScript
{
path: '/vuetable',
    component: resolve => require(['../components/page/VueTable.vue'], resolve)    // vue-datasource component
},
```
Step 2: delete the file that imports the component. Remove the VueTable.vue file from the src/components/page/ directory.

Step 3: delete the page's menu entry. In src/components/common/Sidebar.vue, find the entry and delete the following code.
```HTML
<el-menu-item index="vuetable">Vue表格组件</el-menu-item>
```
Step 4: uninstall the component. Run the following command:

    npm un vue-datasource -S

Done.
### 2. How do I switch the theme color? ###

Step 1: open the src/main.js file, find where the element styles are imported, and switch to the light-green theme.
```javascript
import 'element-ui/lib/theme-default/index.css'; // default theme
// import '../static/css/theme-green/index.css'; // light-green theme
```
Step 2: open the src/App.vue file, find where the styles are imported in the style tag, and switch to the light-green theme.

```css
@import "../static/css/main.css";
@import "../static/css/color-dark.css"; /* dark theme */
/*@import "../static/css/theme-green/color-green.css"; !* light-green theme *!*/
```
Step 3: open the src/components/common/Sidebar.vue file, find the el-menu tag, and remove theme="dark".
## Screenshots ##

### Default theme ###

### Light-green theme ###

---
ms.mktglfcycl: manage
ms.sitesec: library
ms.author: v-kaunu
author: Kateyanne
description: Use this topic to help manage Windows and Windows Server technologies with Windows PowerShell.
external help file: PS_BackgroundTask.cdxml-help.xml
keywords: powershell, cmdlet
manager: jasgro
ms.date: 12/20/2016
ms.prod: w10
ms.technology:
ms.topic: reference
online version:
schema: 2.0.0
title: Get-AppBackgroundTask
ms.reviewer:
ms.assetid: 937AA4D9-7BB6-45CF-9AB0-ED45BFEC1725
---
# Get-AppBackgroundTask
## SYNOPSIS
Gets background task information.
## SYNTAX
```
Get-AppBackgroundTask [-PackageFamilyName <String>] [-IncludeResourceUsage] [-CimSession <CimSession[]>]
[-ThrottleLimit <Int32>] [-AsJob] [<CommonParameters>]
```
## DESCRIPTION
The **Get-AppBackgroundTask** cmdlet gets background task information for the background tasks registered by the package family specified in the *PackageFamilyName* parameter, or for all of the current user's registered background tasks if no package family is specified.
A background task performs an activity for an application, such as downloading a file.
You must have administrator access to get background task information.
## EXAMPLES
### Example 1: Display background tasks
```
PS C:\> Get-AppBackgroundTask -PackageFamilyName "Microsoft.BingSports_8wekyb3d8bbwe"
```
This command displays the registered background tasks that belong to the Microsoft.BingSports_8wekyb3d8bbwe package family.
### Example 2: Display background tasks with resource usage data
```
PS C:\> Get-AppBackgroundTask -PackageFamilyName "Microsoft.BingSports_8wekyb3d8bbwe" -IncludeResourceUsage
```
This command displays the registered background tasks that belong to the Microsoft.BingSports_8wekyb3d8bbwe package family, including detailed resource usage information.
### Example 3: Display all background tasks for a user
```
PS C:\> Get-AppBackgroundTask
```
This command displays all registered background tasks for the current user.
## PARAMETERS
### -AsJob
Runs the cmdlet as a background job. Use this parameter to run commands that take a long time to complete.
The cmdlet immediately returns an object that represents the job and then displays the command prompt.
You can continue to work in the session while the job completes.
To manage the job, use the `*-Job` cmdlets.
To get the job results, use the [Receive-Job](http://go.microsoft.com/fwlink/?LinkID=113372) cmdlet.
For more information about Windows PowerShell background jobs, see [about_Jobs](http://go.microsoft.com/fwlink/?LinkID=113251).
```yaml
Type: SwitchParameter
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -CimSession
Runs the cmdlet in a remote session or on a remote computer.
Enter a computer name or a session object, such as the output of a [New-CimSession](http://go.microsoft.com/fwlink/p/?LinkId=227967) or [Get-CimSession](http://go.microsoft.com/fwlink/p/?LinkId=227966) cmdlet.
The default is the current session on the local computer.
```yaml
Type: CimSession[]
Parameter Sets: (All)
Aliases: Session
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -IncludeResourceUsage
Indicates that the cmdlet displays detailed resource usage data for a background task.
```yaml
Type: SwitchParameter
Parameter Sets: (All)
Aliases: iru
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -PackageFamilyName
Specifies the package family name for which to display background task information.
```yaml
Type: String
Parameter Sets: (All)
Aliases: pfn
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -ThrottleLimit
Specifies the maximum number of concurrent operations that can be established to run the cmdlet.
If this parameter is omitted or a value of `0` is entered, then Windows PowerShell® calculates an optimum throttle limit for the cmdlet based on the number of CIM cmdlets that are running on the computer.
The throttle limit applies only to the current cmdlet, not to the session or to the computer.
```yaml
Type: Int32
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### CommonParameters
This cmdlet supports the common parameters: -Debug, -ErrorAction, -ErrorVariable, -InformationAction, -InformationVariable, -OutVariable, -OutBuffer, -PipelineVariable, -Verbose, -WarningAction, and -WarningVariable. For more information, see [about_CommonParameters](http://go.microsoft.com/fwlink/?LinkID=113216).
## INPUTS
## OUTPUTS
### Microsoft.Management.Infrastructure.CimInstance#MSFT_BackgroundTask[]
## NOTES
## RELATED LINKS
[Start-AppBackgroundTask](./Start-AppBackgroundTask.md)
[Unregister-AppBackgroundTask](./Unregister-AppBackgroundTask.md)
# Replacing File Explorer with Files (Experimental)
*This setting modifies the system registry; make sure to create a backup beforehand and proceed at your own risk.
Please keep in mind that these changes are not reverted when the app is removed, so make sure to turn this setting off before removing the app from your device.*
**Setting Files as the default file manager**
*This is currently only available for the sideload version of Files and is not supported in the store version.*
1. Open the settings dialog in Files
2. Navigate to the experimental section and toggle the switch to set Files as the default file manager

# AcornHall
Experimental 3D third-person adventure developed with hierarchical scene-driven design.
# Goals
## Prototype
The prototype will be considered complete when the following features are finished:
- Basic movement controls are complete
- Basic primary attack controls are complete
- Basic secondary attack controls are complete
- Slime enemy is completed
- Test geometry temple is completed
- First version of Inventory system is completed
- One pickup item is completed
# Development Information
Log: https://swimleaks.neocities.org/logs/
# Update androidForWorkVpnConfiguration
> **Important:** APIs under the /beta version in Microsoft Graph are in preview and are subject to change. Use of these APIs in production applications is not supported.
> **Note:** Using the Microsoft Graph APIs to configure Intune controls and policies still requires that the Intune service is [correctly licensed](https://go.microsoft.com/fwlink/?linkid=839381) by the customer.
Update the properties of a [androidForWorkVpnConfiguration](../resources/intune_deviceconfig_androidforworkvpnconfiguration.md) object.
## Prerequisites
One of the following permissions is required to call this API. To learn more, including how to choose permissions, see [Permissions](../../../concepts/permissions_reference.md).
|Permission type|Permissions (from most to least privileged)|
|:---|:---|
|Delegated (work or school account)|DeviceManagementConfiguration.ReadWrite.All|
|Delegated (personal Microsoft account)|Not supported.|
|Application|Not supported.|
## HTTP Request
<!-- {
"blockType": "ignored"
}
-->
``` http
PATCH /deviceManagement/deviceConfigurations/{deviceConfigurationId}
PATCH /deviceManagement/deviceConfigurations/{deviceConfigurationId}/groupAssignments/{deviceConfigurationGroupAssignmentId}/deviceConfiguration
```
## Request headers
|Header|Value|
|:---|:---|
|Authorization|Bearer <token>. Required.|
|Accept|application/json|
## Request body
In the request body, supply a JSON representation for the [androidForWorkVpnConfiguration](../resources/intune_deviceconfig_androidforworkvpnconfiguration.md) object.
The following table shows the properties that are required when you create the [androidForWorkVpnConfiguration](../resources/intune_deviceconfig_androidforworkvpnconfiguration.md).
|Property|Type|Description|
|:---|:---|:---|
|id|String|Key of the entity. Inherited from [deviceConfiguration](../resources/intune_deviceconfig_deviceconfiguration.md)|
|lastModifiedDateTime|DateTimeOffset|DateTime the object was last modified. Inherited from [deviceConfiguration](../resources/intune_deviceconfig_deviceconfiguration.md)|
|createdDateTime|DateTimeOffset|DateTime the object was created. Inherited from [deviceConfiguration](../resources/intune_deviceconfig_deviceconfiguration.md)|
|description|String|Admin provided description of the Device Configuration. Inherited from [deviceConfiguration](../resources/intune_deviceconfig_deviceconfiguration.md)|
|displayName|String|Admin provided name of the device configuration. Inherited from [deviceConfiguration](../resources/intune_deviceconfig_deviceconfiguration.md)|
|version|Int32|Version of the device configuration. Inherited from [deviceConfiguration](../resources/intune_deviceconfig_deviceconfiguration.md)|
|connectionName|String|Connection name displayed to the user.|
|connectionType|String|Connection type. Possible values are: `ciscoAnyConnect`, `pulseSecure`, `f5EdgeClient`, `dellSonicWallMobileConnect`, `checkPointCapsuleVpn`, `citrix`.|
|role|String|Role when connection type is set to Pulse Secure.|
|realm|String|Realm when connection type is set to Pulse Secure.|
|servers|[vpnServer](../resources/intune_deviceconfig_vpnserver.md) collection|List of VPN Servers on the network. Make sure end users can access these network locations. This collection can contain a maximum of 500 elements.|
|fingerprint|String|Fingerprint is a string that will be used to verify the VPN server can be trusted, which is only applicable when connection type is Check Point Capsule VPN.|
|customData|[keyValue](../resources/intune_deviceconfig_keyvalue.md) collection|Custom data when connection type is set to Citrix. This collection can contain a maximum of 25 elements.|
|authenticationMethod|String|Authentication method. Possible values are: `certificate`, `usernameAndPassword`.|
## Response
If successful, this method returns a `200 OK` response code and an updated [androidForWorkVpnConfiguration](../resources/intune_deviceconfig_androidforworkvpnconfiguration.md) object in the response body.
## Example
### Request
Here is an example of the request.
``` http
PATCH https://graph.microsoft.com/beta/deviceManagement/deviceConfigurations/{deviceConfigurationId}
Content-type: application/json
Content-length: 782
{
"lastModifiedDateTime": "2017-01-01T00:00:35.1329464-08:00",
"description": "Description value",
"displayName": "Display Name value",
"version": 7,
"connectionName": "Connection Name value",
"connectionType": "pulseSecure",
"role": "Role value",
"realm": "Realm value",
"servers": [
{
"@odata.type": "microsoft.graph.vpnServer",
"description": "Description value",
"ipAddressOrFqdn": "Ip Address Or Fqdn value",
"address": "Address value",
"isDefaultServer": true
}
],
"fingerprint": "Fingerprint value",
"customData": [
{
"@odata.type": "microsoft.graph.keyValue",
"key": "Key value",
"value": "Value value"
}
],
"authenticationMethod": "usernameAndPassword"
}
```
### Response
Here is an example of the response. Note: The response object shown here may be truncated for brevity. All of the properties will be returned from an actual call.
``` http
HTTP/1.1 200 OK
Content-Type: application/json
Content-Length: 959
{
  "@odata.type": "#microsoft.graph.androidForWorkVpnConfiguration",
  "id": "2cf4c52c-c52c-2cf4-2cc5-f42c2cc5f42c",
  "lastModifiedDateTime": "2017-01-01T00:00:35.1329464-08:00",
  "createdDateTime": "2017-01-01T00:02:43.5775965-08:00",
  "description": "Description value",
  "displayName": "Display Name value",
  "version": 7,
  "connectionName": "Connection Name value",
  "connectionType": "pulseSecure",
  "role": "Role value",
  "realm": "Realm value",
  "servers": [
    {
      "@odata.type": "microsoft.graph.vpnServer",
      "description": "Description value",
      "ipAddressOrFqdn": "Ip Address Or Fqdn value",
      "address": "Address value",
      "isDefaultServer": true
    }
  ],
  "fingerprint": "Fingerprint value",
  "customData": [
    {
      "@odata.type": "microsoft.graph.keyValue",
      "key": "Key value",
      "value": "Value value"
    }
  ],
  "authenticationMethod": "usernameAndPassword"
}
```
# {#ref-install} Installation
Reach is designed to work on POSIX systems with [make](https://en.wikipedia.org/wiki/Make_(software)), [Docker](https://www.docker.com/get-started), and [Docker Compose](https://docs.docker.com/compose/install/) installed.
The best way to install Docker on Mac and Windows is with [Docker Desktop](https://www.docker.com/products/docker-desktop).
:::note
You probably already have `make` installed.
For example, OS X and many other POSIX systems come with `make`, but some versions of Linux do not include it by default and will require you to install it.
If you're on Ubuntu, you can run `sudo apt install make` to get it.
:::
You can install Reach by running:
```cmd
$ curl https://docs.reach.sh/reach -o reach ; chmod +x reach
```
in your project repository.
You can copy this file to other repositories or move it to a directory in your `PATH`, like `~/bin`.
(`PATH` is a UNIX environment variable listing each of the directories that contain programs you can run in a shell session.)
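For example, you can list the directories your shell actually searches by splitting `PATH` on `:`:

```shell
# Print each PATH entry on its own line; `reach` must sit in one of these
# directories (such as ~/bin) to be runnable from any working directory.
echo "$PATH" | tr ':' '\n'
```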
:::note
If you're using Windows, consult [the guide to using Reach on Windows](##guide-windows).
:::
---
title : "06.05 — Variable shadowing (name hiding)"
category :
- LearnCpp
tag :
- C++
- https://www.learncpp.com/
- shadowing
toc: true
toc_sticky: true
use_math : true
last_modified_at: 2022-06-06
---
Each block defines its own scope region. So what happens when we have a variable inside a nested block that has the same name as a variable in an outer block? When this happens, the nested variable “hides” the outer variable in areas where they are both in scope. This is called **name hiding** or **shadowing**.
## Shadowing of local variables
```c++
#include <iostream>
int main() { // outer block
    // Here's the outer block apples.
    int apples{ 5 };

    { // nested block
        // apples refers to outer block apples here.
        // Print value of outer block apples.
        std::cout << apples << '\n';

        // Define apples in the scope of the nested block.
        int apples{ 0 };

        // apples now refers to the nested block apples.
        // The outer block apples is temporarily hidden.

        // This assigns value 10 to nested block apples,
        // not outer block apples.
        apples = 10;

        // Print value of nested block apples.
        std::cout << apples << '\n';
    } // Nested block apples destroyed.

    // Prints value of outer block apples.
    std::cout << apples << '\n';
} // Outer block apples destroyed.
```
Note that if the nested block apples had not been defined, the name `apples` in the nested block would still refer to the outer block `apples`, **so the assignment of value 10 to `apples` would have applied to the outer block `apples`**.
**When inside the nested block, there’s no way to directly access the shadowed variable from the outer block.**
## Shadowing of global variables
Similar to how variables in a nested block can shadow variables in an outer block, **local variables with the same name as a global variable will shadow the global variable wherever the local variable is in scope:**
```c++
#include <iostream>
int value{ 5 }; // global variable

void foo() {
    // value is not shadowed here,
    // so this refers to the global value.
    std::cout << "global variable value: " << value << '\n';
}

int main() {
    // Hides the global variable value until the end of this block.
    int value{ 7 };

    // Increments local value, not global value.
    ++value;

    std::cout << "local variable value: " << value << '\n';
    // "local variable value : 8"

    foo();
    // "global variable value : 5"
} // Local value is destroyed.
```
However, because global variables are part of the global namespace, **we can use the scope operator (`::`) with no prefix to tell the compiler we mean the global variable** instead of the local variable.
```c++
#include <iostream>
int value{ 5 }; // global variable
int main() {
    // Hides the global variable value.
    int value{ 7 };

    // Increments local value, not global value.
    ++value;

    // Decrements global value, not local value
    // (parenthesis added for readability)
    --(::value);

    std::cout << "local variable value: " << value << '\n';
    std::cout << "global variable value: " << ::value << '\n';
} // Local value is destroyed.
```
## Avoid variable shadowing
**Shadowing of local variables should generally be avoided, as it can lead to inadvertent errors where the wrong variable is used or modified.** *Some compilers will issue a warning when a variable is shadowed.*
For the same reason that we recommend avoiding shadowing local variables, **we recommend avoiding shadowing global variables as well**. This is trivially avoidable if all of your global names use a “g_” prefix.
>**Best practice**
Avoid variable shadowing.
---
title: Keyboard Accessibility Presentation at Future of Web Design Tour 2009 in Glasgow
authors:
- patrick-lauke
tags:
- presentation
- fowd
- accessibility
license: cc-by-3.0
---
<p>On Monday, 14 September, I had the pleasure of speaking at the Glasgow leg of the <a href="http://events.carsonified.com/fowd/2009/tour/">Future of Web Design Tour 2009</a> on the topic of <cite>Keyboard accessibility - basic steps towards a more usable and accessible site</cite>.</p>
<p>The slides are available in <a href="http://people.opera.com/patrickl/presentations/FOWD_14.09.2009/FOWD_14.09.2009.odp">OpenOffice (7.5MB)</a> and <a href="http://people.opera.com/patrickl/presentations/FOWD_14.09.2009/FOWD_14.09.2009.pdf">PDF (9.85MB)</a> format. Also, make sure to fact-check the results of my small experiment to find a <a href="http://people.opera.com/patrickl/experiments/keyboard/test">better CSS outline suppression</a>.</p>
<p>It was great to meet up with the local web design and development community, and I’m glad my presentation – and fellow Opera colleague <a href="http://www.brucelawson.co.uk">Bruce Lawson</a>’s reprise of his <a href="http://www.brucelawson.co.uk/2009/future-of-web-design-glasgow/"><cite>Future of HTML 5</cite></a> overview – went down well.</p>
<p>Many thanks to the fine folks at <a href="http://carsonified.com">Carsonified</a> for the perfect venue and flawless organisation, and a special thanks to <a href="http://forabeautifulweb.com">Andy Clarke</a> and <a href="http://boagworld.com">Paul Boag</a> for agreeing to let their sites be ripped apart and used as examples.</p>
# Snyk Apps Demo
This is a demo app that can be used as a guide on how to create Snyk Apps. This repository contains a simple implementation of a Snyk App written in [Typescript](https://www.typescriptlang.org/), [NodeJS](https://nodejs.org/en/) and [EJS](https://ejs.co/).
## Important Note
As mentioned above this demo Snyk App has been written in [Typescript](https://www.typescriptlang.org/), [NodeJS](https://nodejs.org/en/) and [EJS](https://ejs.co/), but developers can use any preferred language or framework of their choice to create a Snyk App.
Also important to mention that we are using [passportjs](https://www.passportjs.org/) for the authentication process with our very own passport strategy [@snyk/passport-snyk-oauth2](https://www.npmjs.com/package/@snyk/passport-snyk-oauth2). Developers can use any available `oauth2` client of their choice or write the authentication code from scratch following our [Snyk Apps Docs](https://docs.snyk.io/features/integrations/snyk-apps).
## Requirements:
- `node` version 10 or greater
- `npm` version 6 or greater
## Getting started:
- Clone the repo `$ git clone https://github.com/snyk/snyk-apps-demo`
- Install all the required dependencies: `$ npm ci` or `npm install`
## Create a new Snyk App:
The first thing you need to do is create an app. If you haven't already created a Snyk App, you can do so via our create script:
```shell
$ npm run create-app -- --authToken=$token --orgId=$id --scopes=$scopes --name="$name"
```
Ex:
```shell
$ npm run create-app -- --authToken=some-token --orgId=some-snyk-org-id --scopes=apps:beta --name=test-snyk-app
```
or with `redirectUris`
```shell
$ npm run create-app -- --authToken=some-token --orgId=some-snyk-org-id --redirect-uris=https://your-domain/callback --scopes=apps:beta --name=test-snyk-app
```
(note the extra `--` between `create-app` and the parameters)
- `authToken`(**Required**/**String**): your personal Snyk auth token, obtained from [your account settings page](https://app.snyk.io/account)
- `orgId` (**Required**/**String**): the organization id that you want to own the Snyk App (obtained by clicking the cog in the upper right corner of the Snyk console)
- `redirectUris` (**Optional**/**String Array**): a space separated list of redirect uris for your app, defaults to `http://localhost:3000/callback` when no input provided
- `scopes` (**Required**/**String Array**): a space separated list of scopes you want your App to be able to request at install time (see [Snyk Apps docs](https://docs.snyk.io/integrations/snyk-apps) for allowed values)
- `name` (**Required**/**String**): the friendly name of your Snyk App
This will register your new app with Snyk and create the `.env` file (see below) with your new `CLIENT_ID`, `CLIENT_SECRET`, `REDIRECT_URI`, `SCOPES` and `ENCRYPTION_SECRET`. Keep these values secure!
- `CLIENT_ID`: the client id associated with your Snyk App
- `CLIENT_SECRET`: super secret client secret associated with your Snyk App
- `REDIRECT_URI`: the redirect uri used by your Snyk App
- `SCOPES`: the space-separated list of scopes for your Snyk App
- `ENCRYPTION_SECRET`: secret encryption key used by the demo app to encrypt sensitive data
## Running the Demo Snyk App:
1. Run the following command to compile TypeScript into JavaScript
```
$ npm run build
```
2. Once the TypeScript has been compiled to JavaScript (into the `./dist` directory), run
```
$ npm run dev
```
3. Go to [localhost:3000](http://localhost:3000) to confirm that the app is running successfully
## The .env File:
The `.env` file is used to store environment variables. Ensure this remains secret! If you've already created a Snyk App, you can copy `.env.example` and set the values.
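For reference, the generated file follows the usual dotenv `KEY=value` layout — the values below are placeholders, not working credentials:

```shell
# .env -- keep this file out of version control
CLIENT_ID="your-snyk-app-client-id"
CLIENT_SECRET="your-snyk-app-client-secret"
REDIRECT_URI="http://localhost:3000/callback"
SCOPES="apps:beta"
ENCRYPTION_SECRET="long-random-string-used-to-encrypt-sensitive-data"
```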
# helloworld-transifex-rtd
This project demonstrates a continuous integration and deployment workflow involving
Sphinx, Transifex, GitHub Actions, and ReadTheDocs. I created it to supplement the documentation in
https://github.com/jupyter/jupyter/pull/475 describing how to enable translations of Sphinx
documentation for Project Jupyter.

Points of interest
- The Sphinx configuration `docs/source/conf.py` which configures the path where `.po` language
files reside
- The GitHub Action workflow `.github/workflows/gettext.yml` which regenerates and commits the
English `.po` files automatically any time the English Markdown or restructuredText documentation
changes on the `master` branch
- https://github.com/parente/helloworld-transifex-rtd/commit/9c2b4212ad - an example commit made by
the GitHub Action workflow after I added a new text to `second.md`
- https://github.com/parente/helloworld-transifex-rtd/pull/1 - an example pull request submitted by
Transifex after I translated all of the English text in the `first.po` file to Spanish using the
Transifex web app
- https://helloworld-transifex-rtd.readthedocs.io/en/latest/ - the English docs hosted on
ReadTheDocs
- https://helloworld-transifex-rtd.readthedocs.io/es/latest/ - the translated Spanish docs hosted on
ReadTheDocs
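The i18n-relevant piece of that Sphinx configuration is small. A representative fragment (`locale_dirs` and `gettext_compact` are real Sphinx options; the path reflects this repo's assumed layout):

```python
# docs/source/conf.py -- tell Sphinx where translated catalogs live
locale_dirs = ["locale/"]   # e.g. locale/es/LC_MESSAGES/*.po
gettext_compact = False     # one catalog per source document, not one big file
```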
# Change Log
## [2.1.4] - 24-11-2018
- Updated license
- Updated Easylogging++ to 9.96.7
## [2.1.3] - 07-09-2018
### Updated
- Updated Easylogging++ to 9.96.5
## [2.1.2] - 28-03-2018
### Fixes
- Fix `RESIDUE_HOME` if not available
## [2.1.1] - 27-03-2018
### Updates
- Moved exceptions out of include for native bindings
## [2.1.0] - 25-03-2018
### API Updates
- Added `loadConfigurationFromJson` to load from JSON parameter
- Added `loadConnection` and `saveConnection`
### Updates
- Updated internal networking library (asio) to 1.12.0
- Client private key secret must be hex encoded now
- Configurations now support `RESIDUE_HOME` environment variable
## [2.0.1] - 21-03-2018
- Fix disconnect
## [2.0.0] - 01-03-2018
### Fixes
- Compatibility for server 2.0.0
- Updated Easylogging++ to 9.96.2
## [1.2.3] - 23-02-2018
### Updates
- Removed plain log request to match server 1.5+
- Updated Easylogging++ to 9.96.1
## [1.2.2]
### Updates
- Upgraded Easylogging++ from 9.95.4 to 9.96.0
## [1.2.1]
### Updates
- Separated translation units for development
## [1.1.0]
### Updates
- Removed dependency on linked boost
- Include easylogging++ with packages to avoid conflicts
- Residue headers are now installed in `residue/` directory since it contains it's own version of Easylogging++
## [1.0.2]
### Updates
- License information update
- Ripe upgraded to 4.1.1
### Fixes
- Fix licensee crash issue
## [1.0.1] - 06-10-2017
### Changes
- Compatibility with residue v1.2.0
- Added `serverVersion` under `Residue::instance()`
## [1.0.0] - 28-09-2017
### Fixes
- Static linking of crypto libs
## [1.0.0-beta.17] - 25-09-2017
### Updates
- A lot of minor internal updates with data types and regression testing
## [1.0.0-beta.16] - 13-08-2017
### Updates
- Updated to match same configuration name
### Changes
- Changed header text for files
## [1.0.0-beta.15] - 07-08-2017
### Changes
- Minor speed improvement with bulk loader when connecting
## [1.0.0-beta.14] - 03-08-2017
### Fixes
- Fixed issue with logging DEBUG when built with Release build (use Easylogging++ v9.95.0+)
### Added
- Use of `CHECK_TOKENS` server flag to reduce overhead of pulling token when not needed
- Ability to re-establish connection if disconnected from remote
### Changed
- Increased `TOUCH_THRESHOLD` to 2 minutes
## [1.0.0-beta.13] - 02-08-2017
### Fixed
- Fixed compression flag
### Added
- Internal logging level helper enum class
## [1.0.0-beta.12] - 27-07-2017
### Added
- Ability to set internal logging level via configuration using `internal_logging_level`
### Fixed
- Fixed issue with pinging client when client_age < 60
## [1.0.0-beta.11] - 22-07-2017
### Added
- Provide RSA key secret with `secret_key`
## [1.0.0-beta.4] - 07-07-2017
### Fixed
- Fixed deadlock on `reset()`
## [1.0.0-beta.3] - 09-05-2017
### Fixed
- Fixed issue with failing to connect to token and/or logging server. Now throws exception
- Error text on failure
- Fixed exception throwing in `connect()`
- Fixed issue with re-connecting broken socket
## [1.0.0-beta.2] - 20-04-2017
### Added
- Ability to specify server public key
- Added `Residue::setThreadName` (wrapper for `el::Helpers::setThreadName`)
- Added `Residue::setInternalLoggingLevel` for internal logging
- Added `Residue::setApplicationArgs` (wrapper for `START_EASYLOGGINGPP`)
- Added `Residue::reconnect()`
- Added `Residue::moveAccessCodeMap`
- Added `Residue::connect(host, port)` without access code map to be able to connect to different host using existing map
- Added `Residue::enableCrashHandler`
- Added JSON configuration helper `Residue::loadConfiguration`
### Changes
- By default `AutoBulkParams` is now enabled
## [1.0.0-beta] - 31-03-2017
### Added
- Support sending plain log requests in lib
### Fixed
- Issue with dead client and resetting connection caused issue with dispatcher thread in client lib
## [1.0.0-alpha] - 19-03-2017
### Added
- Initial alpha release
---
layout: default
title: "Walkthrough: Testing the Backend Services Workflow"
parent: API
nav_order: 6
---
This page documents a set of steps to connect to the Sara Alert FHIR API using the SMART on FHIR Backend Services Workflow as described in [Getting Started](api-getting-started#smart-on-fhir-backend-services-workflow).
**If you are testing against a local instance of Sara Alert, follow all of the instructions below. If you are testing against the demo server, ignore sections with (LOCAL TESTING ONLY), and note that some numbers in the list of instructions will be skipped.**
## Setup the Environment (LOCAL TESTING ONLY)
If you are testing on the demo server, skip to the next section.
<details>
<summary>Expand only if testing on a LOCAL instance of Sara Alert</summary>
<div markdown="1">
**1.** Clone and run Sara Alert following the steps in the [README](https://github.com/SaraAlert/SaraAlert/blob/master/README.md) for local setup. Make sure to have the database, Redis, and Sidekiq running for the full experience. At a minimum, the database and Redis need to be running.
**2.** Optionally, connect to the database to query some of the tables as we go through the workflow using `mysql --user=disease_trakker`
</div>
</details>
## Create a JSON Web Key (JWK)
**3.** For this tutorial, use <https://mkjwk.org>. Create a JWK with the following settings: Key Size `2048`, Key Use `Signature`, Algorithm `RS384`, Key ID `SHA-256`, Show X.509 `Yes`. Click the "Generate" button.

Either keep this tool open with the generated values or save off all of the displayed values somewhere:
- Public and Private Keypair
- Public and Private Keypair Set
- Public Key
- Private Key (X.509 PEM Format)
- Self-Signed Certificate
- Public Key (X.509 PEM Format)
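If you prefer the command line over the mkjwk web tool, an equivalent 2048-bit RSA keypair can be generated with OpenSSL (you would still need to convert the public key to JWK form before registering it):

```shell
# Private key (PEM) plus the matching public key
openssl genrsa -out private_key.pem 2048
openssl rsa -in private_key.pem -pubout -out public_key.pem
```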
## Register a New API Client Application (LOCAL TESTING ONLY)
If you are testing on the demo server, skip to the next section.
<details>
<summary>Expand only if testing on a LOCAL instance of Sara Alert</summary>
<div markdown="1">
**4.** Run the `admin:create_oauth_app_for_backend_services_workflow` rake task to both create a new "shadow user" to be used by this new application when creating/updating records, and to create the new OAuth application as well. This rake task requires that you first set an environment variable called `API_FILE_PATH` to the path of a json file that contains needed data.
For example, if there is a file named `api_data.json` that looks like the following:
```
{
  "app_name": "test-m2m-app",
  "email": "[email protected]",
  "jurisdiction_path": "USA",
  "public_key_set": {
    "keys": [<PUBLIC_KEY>]
  },
  "scopes": "system/Patient.* system/Observation.read system/QuestionnaireResponse.read user/Patient.* user/Observation.read user/QuestionnaireResponse.read",
  "redirect_uri": "urn:ietf:wg:oauth:2.0:oob"
}
```
You can then set the environment variable:
```
export API_FILE_PATH="path/to/api_data.json"
```
and then run the rake task.
```
bundle exec rake admin:create_oauth_app_for_backend_services_workflow
```
You will see the Client ID of the shadow user and OAuth Application as part of the output:
```
Successfully created user with ID <GENERATED_USER_ID> and email [email protected]!
Successfully created user with OAuth Application!
Client ID: <GENERATED_CLIENT_ID>
```
**5.** OPTIONAL: Verify the application was properly registered by querying the database.
```
mysql> select * from oauth_applications;
+----+-----------+--------------------+-----------------------+--------------------------------+-----------------------------------+--------------+----------------------------+----------------------------+---------------------+-----------------+
| id | name | uid | secret | redirect_uri | scopes | confidential | created_at | updated_at | public_key_set | jurisdiction_id |
+----+-----------+--------------------+-----------------------+--------------------------------+-----------------------------------+--------------+----------------------------+----------------------------+---------------------+-----------------+
| 1 | demo | demo-oauth-app-uid | demo-oauth-app-secret | http://localhost:4000/redirect | user/Patient.* | 1 | 2020-06-02 13:22:47.550013 | 2020-06-02 13:22:47.550013 | NULL | NULL |
| | | | | | user/Observation.read | | | | | |
| | | | | | user/QuestionnaireResponse.read | | | | | |
| 4 | myTestApp | myTestApp | <ABRIDGED> | urn:ietf:wg:oauth:2.0:oob | system/Patient.* | 1 | 2020-09-08 20:15:11.183139 | 2020-09-08 20:15:11.183139 | ---keys: <ABRIDGED> | 1 |
| | | | | | system/Observation.read | | | | | |
| | | | | | system/QuestionnaireResponse.read | | | | | |
+----+-----------+--------------------+-----------------------+--------------------------------+-----------------------------------+--------------+----------------------------+----------------------------+---------------------+-----------------+
2 rows in set (0.00 sec)
```
</div>
</details>
## Request Access Token: Create a Signed JWT
**6.** We need a signed JWT to request an access token. For this tutorial, use <https://jwt.io/#debugger-io>.
In the `Decoded` section, enter the following `HEADER`:
```json
{
  "alg":"RS384",
  "kid":"<KID FROM PUBLIC KEY>",
  "typ":"JWT"
}
```
In the `PAYLOAD` section enter:
If using DEMO server:
```javascript
{
  "iss":"myTestApp", // Example value that should be replaced with your Client ID
  "sub":"myTestApp", // Example value that should be replaced with your Client ID
  "aud":"https://demo.saraalert.org/oauth/token",
  "exp":1599600491, // Make sure this time is in the future otherwise you will see a SignatureExpired error
  "jti":"1599600191" // Must be a random unique identifier for this JWT
}
```
<details>
<summary> OR: Expand if using LOCAL server</summary>
<div markdown="1">
```javascript
{
  "iss":"myTestApp", // Example value that should be replaced with your Client ID
  "sub":"myTestApp", // Example value that should be replaced with your Client ID
  "aud":"http://localhost:3000/oauth/token",
  "exp":1599600491, // Make sure this time is in the future otherwise you will see a SignatureExpired error
  "jti":"1599600191" // Must be a random unique identifier for this JWT
}
```
</div>
</details>

Set the `"exp"` field to be 5 minutes in the future (this is time in seconds since 1 Jan 1970), and set the `"jti"` to be a random non-repeating number.
In the `VERIFY SIGNATURE` field, enter your `PUBLIC KEY` and `PRIVATE KEY` from the `Public Key (X.509 PEM Format)` and `Private Key (X.509 PEM Format)` fields that you generated in Step #3.
Copy the JWT from the `Encoded` field. It should look garbled like this:
```
eyJhbGciOiJSUzM4NCIsImtpZCI6IjNpYlRWLUk0NFppNExza3hIellYeHpVNWpfNThqX0NxRzJiY3lKT0Z1bnciLCJ0eXAiOiJKV1QifQ.eyJpc3MiOiJteVRlc3RBcHAiLCJzdWIiOiJteVRlc3RBcHAiLCJhdWQiOiJodHRwOi8vbG9jYWxob3N0OjMwMDAvb2F1dGgvdG9rZW4iLCJleHAiOjE1OTk2MDA0OTEsImp0aSI6MTU5OTYwMDE5MX0.OljK-13DGC6RvpHTgCFG0FgyFsEAlwcWIA8AEtzr_LrMJ8cTCWUYuLWNBR6TL6fiFIeW5vDJJdQ8zDUZC_rOMN-U-_oIulWNTWzEib3re0-ST8s3d1QFaZwgsa53C7m7WKUNvdEoKl5VA-YUjxayKQ3xbjUqR1aTy5IVkWeFi3iV0s1S53I6ZdpmiKP5MgCkXnLlWHehg10k4Ro571iOd54cphsrDueiCQBF7P88CoWsrV3uUhFnFSBR53JHWzYDX3-LYVDf1VJB_N8h_maD81MMbmGP7QucsXipQvsAA6G9ZfFzj9trvhRpI-Pk47G7aCca1raGMUja8AySybD0ng
```
We are going to use your JWT in the next step, Request an Access Token.
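To see what the debugger is doing under the hood: the part of the JWT that gets signed is just two base64url-encoded JSON objects joined by a dot. A standard-library-only Python sketch (producing the RS384 signature itself requires a JWT library such as PyJWT, omitted here):

```python
import base64
import json
import time

def b64url(data: dict) -> str:
    """JSON-encode then base64url-encode without '=' padding."""
    raw = json.dumps(data, separators=(",", ":")).encode()
    return base64.urlsafe_b64encode(raw).rstrip(b"=").decode()

header = {"alg": "RS384", "kid": "<KID FROM PUBLIC KEY>", "typ": "JWT"}
payload = {
    "iss": "myTestApp",                               # your Client ID
    "sub": "myTestApp",                               # your Client ID
    "aud": "https://demo.saraalert.org/oauth/token",
    "exp": int(time.time()) + 300,                    # 5 minutes in the future
    "jti": str(time.time_ns()),                       # unique per request
}

signing_input = f"{b64url(header)}.{b64url(payload)}"
# A real client appends '.' plus base64url(RS384 signature over signing_input).
print(signing_input)
```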
## Request an Access Token: Build Request
**7.** Using Postman, curl, or whatever HTTP library you like, request an Access Token...
REQUEST
If using DEMO server:
```
curl --location --request POST 'https://demo.saraalert.org/oauth/token' \
--header 'Content-Type: application/x-www-form-urlencoded' \
--data-urlencode 'scope=system/Patient.* system/Observation.read system/QuestionnaireResponse.read' \
--data-urlencode 'grant_type=client_credentials' \
--data-urlencode 'client_assertion_type=urn:ietf:params:oauth:client-assertion-type:jwt-bearer' \
--data-urlencode 'client_assertion=<JWT FROM STEP 6>' \
--data-urlencode 'client_id=myTestApp'
```
<details>
<summary> OR: Expand if using LOCAL server</summary>
<div markdown="1">
```
curl --location --request POST 'http://localhost:3000/oauth/token' \
--header 'Content-Type: application/x-www-form-urlencoded' \
--data-urlencode 'scope=system/Patient.* system/Observation.read system/QuestionnaireResponse.read' \
--data-urlencode 'grant_type=client_credentials' \
--data-urlencode 'client_assertion_type=urn:ietf:params:oauth:client-assertion-type:jwt-bearer' \
--data-urlencode 'client_assertion=<JWT FROM STEP 6>' \
--data-urlencode 'client_id=myTestApp'
```
</div>
</details>
Make sure you use the proper `client_id` and `scope` that you registered in previous steps.
RESPONSE
```
{
  "access_token": "fXHoedJMq-mdf8cqQvw5a4AY7SOb92McbJvDzNSP5q4",
  "token_type": "Bearer",
  "expires_in": 7200,
  "scope": "system/Patient.* system/Observation.read system/QuestionnaireResponse.read",
  "created_at": 1599601092
}
```
We are going to use the `"access_token"` value in API requests.
## FHIR Requests
**8.** Using Postman, curl, or whatever HTTP library you like, request some FHIR Resources...
REQUEST
If using DEMO server:
```
curl --location --request GET 'https://demo.saraalert.org/fhir/r4/Patient/1' \
--header 'Accept: application/fhir+json' \
--header 'Authorization: Bearer fXHoedJMq-mdf8cqQvw5a4AY7SOb92McbJvDzNSP5q4'
```
<details>
<summary>OR: Expand if using LOCAL server</summary>
<div markdown="1">
```
curl --location --request GET 'http://localhost:3000/fhir/r4/Patient/1' \
--header 'Accept: application/fhir+json' \
--header 'Authorization: Bearer fXHoedJMq-mdf8cqQvw5a4AY7SOb92McbJvDzNSP5q4'
```
</div>
</details>
Make sure you replace the token in the example with the token you obtained in Step #7.
The response should be an HTTP 200 with a JSON formatted FHIR Patient.
| 48.440529 | 601 | 0.622317 | eng_Latn | 0.762643 |
# ReviewBoard i18n Configuration
[[toc]]
ReviewBoard is a code review system; it does not come with a Chinese translation configured by default. This section explains how to localize ReviewBoard into Chinese.
## Basic information
Check the Django version:
```sh
[root@helloreview ~]# pip list|grep Django
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
Django 1.6.11
```
Check the Django installation directory:
```sh
[root@helloreview ~]# ls -lah /usr/lib/python2.7/site-packages/django
total 36K
drwxr-xr-x 17 root root 255 Aug 27 11:01 .
drwxr-xr-x. 95 root root 8.0K Sep 5 20:23 ..
drwxr-xr-x 3 root root 217 Aug 27 11:01 bin
drwxr-xr-x 6 root root 168 Sep 5 22:38 conf
drwxr-xr-x 19 root root 317 Aug 27 11:01 contrib
drwxr-xr-x 10 root root 4.0K Aug 27 11:01 core
drwxr-xr-x 4 root root 153 Aug 27 11:01 db
drwxr-xr-x 2 root root 125 Aug 27 11:01 dispatch
drwxr-xr-x 3 root root 269 Aug 27 11:01 forms
drwxr-xr-x 2 root root 242 Aug 27 11:01 http
-rw-r--r-- 1 root root 270 Aug 27 11:00 __init__.py
-rw-r--r-- 1 root root 465 Aug 27 11:01 __init__.pyc
drwxr-xr-x 2 root root 4.0K Sep 1 09:52 middleware
drwxr-xr-x 2 root root 45 Aug 27 11:01 shortcuts
drwxr-xr-x 3 root root 4.0K Aug 27 11:01 template
drwxr-xr-x 2 root root 237 Aug 27 11:01 templatetags
drwxr-xr-x 2 root root 331 Aug 27 11:01 test
drwxr-xr-x 5 root root 4.0K Aug 27 11:01 utils
drwxr-xr-x 4 root root 247 Aug 27 11:01 views
```
Check the ReviewBoard version:
```sh
[root@helloreview ~]# pip list|grep -i Review
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
ReviewBoard 3.0.12
```
ReviewBoard application directory:
```sh
[root@helloreview ~]# ls -lah /usr/lib64/python2.7/site-packages/reviewboard
total 240K
drwxr-xr-x 29 root root 4.0K Sep 6 13:00 .
drwxr-xr-x. 33 root root 4.0K Sep 6 14:00 ..
drwxr-xr-x 8 root root 4.0K Sep 2 19:58 accounts
drwxr-xr-x 5 root root 4.0K Sep 5 23:37 admin
drwxr-xr-x 5 root root 4.0K Aug 27 11:01 attachments
drwxr-xr-x 3 root root 254 Sep 5 23:41 avatars
drwxr-xr-x 3 root root 164 Aug 27 11:01 changedescs
drwxr-xr-x 3 root root 158 Aug 27 11:01 cmdline
drwxr-xr-x 2 root root 298 Sep 2 19:59 datagrids
-rw-r--r-- 1 root root 5.2K Aug 27 10:59 dependencies.py
-rw-r--r-- 1 root root 5.3K Aug 27 11:01 dependencies.pyc
-rw-r--r-- 1 root root 1.2K Aug 27 10:59 deprecation.py
-rw-r--r-- 1 root root 1.6K Aug 27 11:01 deprecation.pyc
drwxr-xr-x 7 root root 4.0K Sep 4 21:24 diffviewer
drwxr-xr-x 6 root root 281 Aug 27 11:01 extensions
drwxr-xr-x 2 root root 117 Aug 27 11:01 features
drwxr-xr-x 7 root root 4.0K Aug 27 11:01 hostingsvcs
drwxr-xr-x 4 root root 33 Aug 27 11:01 htdocs
-rw-r--r-- 1 root root 5.0K Aug 27 10:59 __init__.py
-rw-r--r-- 1 root root 4.4K Aug 27 11:01 __init__.pyc
drwxr-xr-x 3 root root 254 Aug 27 11:01 integrations
drwxr-xr-x 9 root root 91 Sep 1 10:17 locale
-rw-r--r-- 1 root root 12K Aug 27 10:59 manage.py
-rw-r--r-- 1 root root 8.5K Aug 27 11:01 manage.pyc
-rw-r--r-- 1 root root 179 Aug 27 10:59 nose.cfg
drwxr-xr-x 6 root root 288 Aug 27 11:01 notifications
drwxr-xr-x 3 root root 4.0K Aug 27 11:01 oauth
-rw-r--r-- 1 root root 949 Aug 27 10:59 rb_platform.py
-rw-r--r-- 1 root root 635 Aug 27 11:01 rb_platform.pyc
drwxr-xr-x 2 root root 117 Aug 27 11:01 registries
drwxr-xr-x 2 apache apache 29 Aug 30 21:52 reviewboardlog
drwxr-xr-x 8 root root 4.0K Sep 5 22:35 reviews
drwxr-xr-x 9 root root 4.0K Aug 27 11:01 scmtools
drwxr-xr-x 3 root root 4.0K Aug 27 11:01 search
-rw-r--r-- 1 root root 17K Sep 5 22:09 settings.py
-rw-r--r-- 1 root root 2.0K Aug 27 10:59 signals.py
-rw-r--r-- 1 root root 1.8K Aug 27 11:01 signals.pyc
drwxr-xr-x 5 root root 4.0K Aug 30 22:33 site
drwxr-xr-x 2 root root 253 Aug 27 11:01 ssh
drwxr-xr-x 4 root root 27 Aug 27 11:01 static
-rw-r--r-- 1 root root 23K Aug 27 10:59 staticbundles.py
-rw-r--r-- 1 root root 34K Aug 27 11:01 staticbundles.pyc
drwxr-xr-x 17 root root 326 Aug 27 11:01 templates
drwxr-xr-x 2 root root 176 Aug 27 11:01 testing
-rw-r--r-- 1 root root 1.7K Aug 27 10:59 test.py
-rw-r--r-- 1 root root 2.3K Aug 27 11:01 test.pyc
-rw-r--r-- 1 root root 1.7K Aug 27 10:59 tests.py
-rw-r--r-- 1 root root 2.3K Aug 27 11:01 tests.pyc
-rw-r--r-- 1 root root 3.9K Aug 27 10:59 urls.py
-rw-r--r-- 1 root root 3.5K Aug 27 11:01 urls.pyc
drwxr-xr-x 5 root root 4.0K Sep 4 23:27 webapi
```
ReviewBoard static files directory:
```sh
[root@helloreview ~]# ls -lah /var/www/html/reviewboard/
total 0
drwxr-xr-x 7 apache apache 67 Sep 1 20:28 .
drwxr-xr-x. 3 root root 34 Sep 6 13:15 ..
drwxr-xr-x 2 apache apache 98 Sep 5 21:45 conf
drwxr-xr-x 3 apache apache 25 Aug 27 11:52 data
drwxr-xr-x 5 apache apache 74 Aug 31 19:52 htdocs
drwxr-xr-x 2 apache apache 6 Aug 27 11:28 logs
drwxrwxrwx 2 apache apache 6 Aug 27 11:28 tmp
```
## Configuration Files
- Django default settings file: `/usr/lib/python2.7/site-packages/django/conf/global_settings.py`
- ReviewBoard's Django project settings file: `/usr/lib64/python2.7/site-packages/reviewboard/settings.py`
- ReviewBoard site settings file: `/var/www/html/reviewboard/conf/settings_local.py`
Django's language and time zone settings:
```sh
[root@hellolinux ~]# cat -n /usr/lib/python2.7/site-packages/django/conf/global_settings.py|sed -n '36,144p'
36 # Local time zone for this installation. All choices can be found here:
37 # http://en.wikipedia.org/wiki/List_of_tz_zones_by_name (although not all
38 # systems may support all possibilities). When USE_TZ is True, this is
39 # interpreted as the default user time zone.
40 TIME_ZONE = 'America/Chicago'
41
42 # If you set this to True, Django will use timezone-aware datetimes.
43 USE_TZ = False
44
45 # Language code for this installation. All choices can be found here:
46 # http://www.i18nguy.com/unicode/language-identifiers.html
47 LANGUAGE_CODE = 'en-us'
48
49 # Languages we provide translations for, out of the box.
50 LANGUAGES = (
51 ('af', gettext_noop('Afrikaans')),
52 ('ar', gettext_noop('Arabic')),
53 ('az', gettext_noop('Azerbaijani')),
54 ('bg', gettext_noop('Bulgarian')),
55 ('be', gettext_noop('Belarusian')),
56 ('bn', gettext_noop('Bengali')),
57 ('br', gettext_noop('Breton')),
58 ('bs', gettext_noop('Bosnian')),
59 ('ca', gettext_noop('Catalan')),
60 ('cs', gettext_noop('Czech')),
61 ('cy', gettext_noop('Welsh')),
62 ('da', gettext_noop('Danish')),
63 ('de', gettext_noop('German')),
64 ('el', gettext_noop('Greek')),
65 ('en', gettext_noop('English')),
66 ('en-gb', gettext_noop('British English')),
67 ('eo', gettext_noop('Esperanto')),
68 ('es', gettext_noop('Spanish')),
69 ('es-ar', gettext_noop('Argentinian Spanish')),
70 ('es-mx', gettext_noop('Mexican Spanish')),
71 ('es-ni', gettext_noop('Nicaraguan Spanish')),
72 ('es-ve', gettext_noop('Venezuelan Spanish')),
73 ('et', gettext_noop('Estonian')),
74 ('eu', gettext_noop('Basque')),
75 ('fa', gettext_noop('Persian')),
76 ('fi', gettext_noop('Finnish')),
77 ('fr', gettext_noop('French')),
78 ('fy-nl', gettext_noop('Frisian')),
79 ('ga', gettext_noop('Irish')),
80 ('gl', gettext_noop('Galician')),
81 ('he', gettext_noop('Hebrew')),
82 ('hi', gettext_noop('Hindi')),
83 ('hr', gettext_noop('Croatian')),
84 ('hu', gettext_noop('Hungarian')),
85 ('ia', gettext_noop('Interlingua')),
86 ('id', gettext_noop('Indonesian')),
87 ('is', gettext_noop('Icelandic')),
88 ('it', gettext_noop('Italian')),
89 ('ja', gettext_noop('Japanese')),
90 ('ka', gettext_noop('Georgian')),
91 ('kk', gettext_noop('Kazakh')),
92 ('km', gettext_noop('Khmer')),
93 ('kn', gettext_noop('Kannada')),
94 ('ko', gettext_noop('Korean')),
95 ('lb', gettext_noop('Luxembourgish')),
96 ('lt', gettext_noop('Lithuanian')),
97 ('lv', gettext_noop('Latvian')),
98 ('mk', gettext_noop('Macedonian')),
99 ('ml', gettext_noop('Malayalam')),
100 ('mn', gettext_noop('Mongolian')),
101 ('my', gettext_noop('Burmese')),
102 ('nb', gettext_noop('Norwegian Bokmal')),
103 ('ne', gettext_noop('Nepali')),
104 ('nl', gettext_noop('Dutch')),
105 ('nn', gettext_noop('Norwegian Nynorsk')),
106 ('os', gettext_noop('Ossetic')),
107 ('pa', gettext_noop('Punjabi')),
108 ('pl', gettext_noop('Polish')),
109 ('pt', gettext_noop('Portuguese')),
110 ('pt-br', gettext_noop('Brazilian Portuguese')),
111 ('ro', gettext_noop('Romanian')),
112 ('ru', gettext_noop('Russian')),
113 ('sk', gettext_noop('Slovak')),
114 ('sl', gettext_noop('Slovenian')),
115 ('sq', gettext_noop('Albanian')),
116 ('sr', gettext_noop('Serbian')),
117 ('sr-latn', gettext_noop('Serbian Latin')),
118 ('sv', gettext_noop('Swedish')),
119 ('sw', gettext_noop('Swahili')),
120 ('ta', gettext_noop('Tamil')),
121 ('te', gettext_noop('Telugu')),
122 ('th', gettext_noop('Thai')),
123 ('tr', gettext_noop('Turkish')),
124 ('tt', gettext_noop('Tatar')),
125 ('udm', gettext_noop('Udmurt')),
126 ('uk', gettext_noop('Ukrainian')),
127 ('ur', gettext_noop('Urdu')),
128 ('vi', gettext_noop('Vietnamese')),
129 ('zh-cn', gettext_noop('Simplified Chinese')),
130 ('zh-tw', gettext_noop('Traditional Chinese')),
131 )
132
133 # Languages using BiDi (right-to-left) layout
134 LANGUAGES_BIDI = ("he", "ar", "fa", "ur")
135
136 # If you set this to False, Django will make some optimizations so as not
137 # to load the internationalization machinery.
138 USE_I18N = True
139 LOCALE_PATHS = ()
140 LANGUAGE_COOKIE_NAME = 'django_language'
141
142 # If you set this to True, Django will format dates, numbers and calendars
143 # according to user current locale.
144 USE_L10N = False
```
ReviewBoard's language and time zone settings:
```sh
[root@hellolinux ~]# cat -n /usr/lib64/python2.7/site-packages/reviewboard/settings.py|sed -n '29,70p'
29 # Time zone support. If enabled, Django stores date and time information as
30 # UTC in the database, uses time zone-aware datetime objects, and translates
31 # them to the user's time zone in templates and forms.
32 USE_TZ = True
33
34 # Local time zone for this installation. All choices can be found here:
35 # http://www.postgresql.org/docs/8.1/static/datetime-keywords.html#DATETIME-TIMEZONE-SET-TABLE
36 # When USE_TZ is enabled, this is used as the default time zone for datetime
37 # objects
38 TIME_ZONE = 'UTC'
39
40 # Language code for this installation. All choices can be found here:
41 # http://www.w3.org/TR/REC-html40/struct/dirlang.html#langcodes
42 # http://blogs.law.harvard.edu/tech/stories/storyReader$15
43 LANGUAGE_CODE = 'en-us'
44
45 # This should match the ID of the Site object in the database. This is used to
46 # figure out URLs to stick in e-mails and related pages.
47 SITE_ID = 1
48
49 # The prefix for e-mail subjects sent to administrators.
50 EMAIL_SUBJECT_PREFIX = "[Review Board] "
51
52 # Whether to allow for smart spoofing of From addresses for e-mails.
53 #
54 # If enabled (default), DMARC records will be looked up before determining
55 # whether to use the user's e-mail address as the From address.
56 #
57 # If disabled, the old, dumb approach of assuming we can spoof will be used.
58 EMAIL_ENABLE_SMART_SPOOFING = True
59
60 # Default name of the service used in From e-mail when not spoofing.
61 #
62 # This should generally not be overridden unless one needs to thoroughly
63 # distinguish between two different Review Board servers AND DMARC is causing
64 # issues for e-mails.
65 EMAIL_DEFAULT_SENDER_SERVICE_NAME = 'Review Board'
66
67 # If you set this to False, Django will make some optimizations so as not
68 # to load the internationalization machinery.
69 USE_I18N = True
70
```
At this point the ReviewBoard login page looks like this:

## Modifying the Configuration
Modify the Django settings:
```py
TIME_ZONE = 'Asia/Shanghai' # Set the time zone to Asia/Shanghai
USE_TZ = True # Enable time zone support
```
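To illustrate what these two settings do: with time zone support enabled, Django stores datetimes in UTC and converts them to the configured zone for display. A minimal sketch in plain Python (a fixed UTC+8 offset stands in for Asia/Shanghai, which observes no DST):

```python
from datetime import datetime, timedelta, timezone

# A fixed UTC+8 offset standing in for Asia/Shanghai (no DST).
SHANGHAI = timezone(timedelta(hours=8), name="Asia/Shanghai")

# With USE_TZ = True, Django stores aware datetimes in UTC ...
stored = datetime(2020, 1, 1, 12, 0, tzinfo=timezone.utc)

# ... and renders them in TIME_ZONE for templates and forms.
local = stored.astimezone(SHANGHAI)
print(local.isoformat())  # 2020-01-01T20:00:00+08:00
```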
Modify the ReviewBoard settings:
```sh
[root@helloreview reviewboard]# cat -n /usr/lib64/python2.7/site-packages/reviewboard/settings.py|sed -n '29,80p'
29 # Time zone support. If enabled, Django stores date and time information as
30 # UTC in the database, uses time zone-aware datetime objects, and translates
31 # them to the user's time zone in templates and forms.
32 USE_TZ = True
33
34 # Local time zone for this installation. All choices can be found here:
35 # http://www.postgresql.org/docs/8.1/static/datetime-keywords.html#DATETIME-TIMEZONE-SET-TABLE
36 # When USE_TZ is enabled, this is used as the default time zone for datetime
37 # objects
    38 TIME_ZONE = 'Asia/Shanghai' #<-------------- this line was modified
39
40 # Language code for this installation. All choices can be found here:
41 # http://www.w3.org/TR/REC-html40/struct/dirlang.html#langcodes
42 # http://blogs.law.harvard.edu/tech/stories/storyReader$15
    43 LANGUAGE_CODE = 'zh-CN' # en-us,zh-TW,zh-CN #<-------------- this line was modified
44
    45 gettext_noop = lambda s: s #<-------------- this line was added
    46 LANGUAGES = ( #<-------------- this line was added
    47 ('zh-cn', gettext_noop('Simplified Chinese')), #<-------------- this line was added
    48 #('zh-tw', gettext_noop('Traditional Chinese')), #<-------------- this line was added
    49 ) #<-------------- this line was added
50
51 # This should match the ID of the Site object in the database. This is used to
52 # figure out URLs to stick in e-mails and related pages.
53 SITE_ID = 1
54
55 # The prefix for e-mail subjects sent to administrators.
56 EMAIL_SUBJECT_PREFIX = "[Review Board] "
57
58 # Whether to allow for smart spoofing of From addresses for e-mails.
59 #
60 # If enabled (default), DMARC records will be looked up before determining
61 # whether to use the user's e-mail address as the From address.
62 #
63 # If disabled, the old, dumb approach of assuming we can spoof will be used.
64 EMAIL_ENABLE_SMART_SPOOFING = True
65
66 # Default name of the service used in From e-mail when not spoofing.
67 #
68 # This should generally not be overridden unless one needs to thoroughly
69 # distinguish between two different Review Board servers AND DMARC is causing
70 # issues for e-mails.
71 EMAIL_DEFAULT_SENDER_SERVICE_NAME = 'Review Board'
72
73 # If you set this to False, Django will make some optimizations so as not
74 # to load the internationalization machinery.
75 USE_I18N = True
    76 BASE_DIR = os.path.dirname(os.path.dirname(__file__)) #<-------------- this line was added
    77 LOCALE_PATHS = ( #<-------------- this line was added
    78 os.path.join(BASE_DIR, 'locale'), #<-------------- this line was added
    79 ) #<-------------- this line was added
80
```
Explanation:
- `TIME_ZONE = 'Asia/Shanghai'` sets the time zone to Asia/Shanghai.
- `LANGUAGE_CODE = 'zh-CN' # en-us,zh-TW,zh-CN` sets the language code to Simplified Chinese.
- `gettext_noop = lambda s: s` adds the internationalization marker function.
- `LANGUAGES = (('zh-cn', gettext_noop('Simplified Chinese')),)` adds Simplified Chinese support.
- `LANGUAGES = (('zh-tw', gettext_noop('Traditional Chinese')),)` would add Traditional Chinese support; that line is commented out.
- `LOCALE_PATHS` points to the directory that holds the local translation files. Note that a trailing comma after `os.path.join(BASE_DIR, 'locale')` is required so that Python treats the value as a one-element tuple rather than a plain string.
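The `gettext_noop` function is only a marker: it returns its argument unchanged so that string-extraction tools can find the text, while the actual translation lookup happens later at render time. A minimal sketch:

```python
# gettext_noop marks a string for extraction without translating it here.
gettext_noop = lambda s: s

LANGUAGES = (
    ('zh-cn', gettext_noop('Simplified Chinese')),
)

# The marked string is stored verbatim; translation happens at render time.
code, name = LANGUAGES[0]
print(code, name)  # zh-cn Simplified Chinese
```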
## Copying the Translation Folder
Copy the existing zh_TW locale folder to a new zh_CN folder:
```sh
[root@helloreview ~]# cd /usr/lib64/python2.7/site-packages/reviewboard/locale
[root@helloreview locale]# cp -r zh_TW zh_CN
[root@helloreview locale]# ls -lah
total 4.0K
drwxr-xr-x 10 root root 106 Sep 8 11:03 .
drwxr-xr-x 29 root root 4.0K Sep 8 11:02 ..
drwxr-xr-x 3 root root 25 Aug 27 11:01 en
drwxr-xr-x 3 root root 25 Aug 27 11:01 es
drwxr-xr-x 3 root root 25 Aug 27 11:01 it_IT
drwxr-xr-x 3 root root 25 Aug 27 11:01 ko_KR
drwxr-xr-x 3 root root 25 Aug 27 11:01 pt_BR
drwxr-xr-x 3 root root 25 Sep 1 10:17 zh_CN
drwxr-xr-x 3 root root 25 Aug 27 11:01 zh_TW
[root@helloreview locale]# cd zh_CN/LC_MESSAGES
[root@helloreview LC_MESSAGES]# ls -lah
total 220K
drwxr-xr-x 2 root root 78 Sep 6 13:14 .
drwxr-xr-x 3 root root 25 Sep 1 10:17 ..
-rw-r--r-- 1 root root 13K Sep 8 10:39 djangojs.mo
-rw-r--r-- 1 root root 29K Sep 6 13:14 djangojs.po
-rw-r--r-- 1 root root 48K Sep 8 10:39 django.mo
-rw-r--r-- 1 root root 121K Sep 6 11:13 django.po
```
`django.po` and `djangojs.po` are the translation files. The Chinese translations of the English strings in the ReviewBoard interface live in these two files.
Their contents look like this:
```sh
[root@helloreview LC_MESSAGES]# tail -11 django.po
#: templates/datagrids/hideable_listview.html:9
msgid "Show archived"
msgstr "显示归档的评审请求"
#: templates/datagrids/columns.py:716 templates/datagrids/columns.py:717
msgid "Ship It!/Issue Counts"
msgstr "评审通过/问题数量"
#: reviews/default_actions:200
msgid "Add General Comment"
msgstr "新增普通评论"
```
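Conceptually, each `msgid`/`msgstr` pair is just a key/value mapping, and the `{% trans %}` tag looks the English key up in the compiled catalog. A rough sketch of that lookup, using a hand-rolled parser for the simple single-line entry format shown above (not Django's real implementation):

```python
import re

PO_SNIPPET = '''
#: templates/datagrids/hideable_listview.html:9
msgid "Show archived"
msgstr "显示归档的评审请求"

#: reviews/default_actions:200
msgid "Add General Comment"
msgstr "新增普通评论"
'''

def parse_po(text):
    """Build a msgid -> msgstr dict from simple single-line po entries."""
    pairs = re.findall(r'msgid "([^"]*)"\s*msgstr "([^"]*)"', text)
    return dict(pairs)

catalog = parse_po(PO_SNIPPET)
print(catalog["Show archived"])  # 显示归档的评审请求
```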
## Setting Up Command Aliases
Add the following frequently used commands as aliases in `.bashrc`:
```sh
[root@helloreview ~]# echo "alias cdd='cd /usr/lib/python2.7/site-packages/django'" >> ~/.bashrc
[root@helloreview ~]# echo "alias cdr='cd /usr/lib64/python2.7/site-packages/reviewboard'" >> ~/.bashrc
[root@helloreview ~]# echo "alias cdrr='cd /var/www/html/reviewboard'" >> ~/.bashrc
[root@helloreview ~]# echo "alias rcc='pushd /usr/lib64/python2.7/site-packages/reviewboard && django-admin.py compilemessages && popd'" >> ~/.bashrc
[root@helloreview ~]# echo "alias rhttpd='systemctl restart httpd'" >> ~/.bashrc
```
Reload the personal shell configuration:
```sh
[root@helloreview ~]# source ~/.bashrc
```
## Finding the Source File for an English String
Switch to the ReviewBoard app directory:
```sh
[root@helloreview ~]# cdr
[root@helloreview reviewboard]# pwd
/usr/lib64/python2.7/site-packages/reviewboard
```
Search for the source file that contains the English string to translate:
```sh
[root@helloreview reviewboard]# grep -Rn 'Log in to Review Board' * > ../a
[root@helloreview reviewboard]# vi ../a
1 Binary file locale/en/LC_MESSAGES/django.mo matches
2 locale/en/LC_MESSAGES/django.po:3098:msgid "Log in to Review Board"
3 Binary file locale/es/LC_MESSAGES/django.mo matches
4 locale/es/LC_MESSAGES/django.po:3108:msgid "Log in to Review Board"
5 Binary file locale/it_IT/LC_MESSAGES/django.mo matches
6 locale/it_IT/LC_MESSAGES/django.po:3113:msgid "Log in to Review Board"
7 locale/ko_KR/LC_MESSAGES/django.po:3098:msgid "Log in to Review Board"
8 locale/pt_BR/LC_MESSAGES/django.po:3114:msgid "Log in to Review Board"
9 Binary file locale/zh_TW/LC_MESSAGES/django.mo matches
10 locale/zh_TW/LC_MESSAGES/django.po:3138:msgid "Log in to Review Board"
11 Binary file locale/zh_CN/LC_MESSAGES/django.mo matches
12 locale/zh_CN/LC_MESSAGES/django.po:3211:msgid "Log in to Review Board"
13 templates/accounts/login.html:10: <h1>{% trans "Log in to Review Board" %}</h1>
```
This shows that the string "Log in to Review Board" comes from line 10 of `templates/accounts/login.html`. Let's look at that file.
View the source file to be translated:
```sh
[root@helloreview reviewboard]# cat -n templates/accounts/login.html|sed -n '6,12p'
6 {% block auth_content %}
7 {% template_hook_point "before-login-form" %}
8
9 <div class="auth-header">
10 <h1>{% trans "Log in to Review Board" %}</h1>
11 {% if auth_backends.0.login_instructions %}
12 <p>{{auth_backends.0.login_instructions}}</p>
```
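The grep above does by hand what `django-admin makemessages` automates: scanning templates for `{% trans %}` tags and collecting the quoted strings as msgids. A simplified sketch of that scan:

```python
import re

template = '<h1>{% trans "Log in to Review Board" %}</h1>'

# Match {% trans "..." %} tags and capture the quoted string.
TRANS_RE = re.compile(r'{%\s*trans\s+"([^"]*)"\s*%}')

msgids = TRANS_RE.findall(template)
print(msgids)  # ['Log in to Review Board']
```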
## Modifying the Source File for the English String
Modify the content here to see whether the page changes; for example, change "Log in to Review Board" to "Log in to Review Board meizhaohui".
Edit the source file and restart Apache:
```sh
[root@helloreview reviewboard]# cat -n templates/accounts/login.html|sed -n '6,12p'
6 {% block auth_content %}
7 {% template_hook_point "before-login-form" %}
8
9 <div class="auth-header">
10 <h1>{% trans "Log in to Review Board meizhaohui" %}</h1>
11 {% if auth_backends.0.login_instructions %}
12 <p>{{auth_backends.0.login_instructions}}</p>
[root@helloreview reviewboard]# rhttpd
```
Now refresh the page to see whether anything has changed.

After refreshing, "Log in to Review Board meizhaohui" appears on the page, which confirms that this is the source file for the login page text.
Restore `templates/accounts/login.html` to its original state, i.e. change line 10 back to `<h1>{% trans "Log in to Review Board" %}</h1>`.
Verify the restored source file:
```sh
[root@helloreview reviewboard]# cat -n templates/accounts/login.html|sed -n '6,12p'
6 {% block auth_content %}
7 {% template_hook_point "before-login-form" %}
8
9 <div class="auth-header">
10 <h1>{% trans "Log in to Review Board" %}</h1>
11 {% if auth_backends.0.login_instructions %}
12 <p>{{auth_backends.0.login_instructions}}</p>
```
## Adding the Translation Text
Add the translation to the translation file:
```sh
[root@helloreview reviewboard]# cat -n locale/zh_CN/LC_MESSAGES/django.po|sed -n '3206,3216p'
3206 #: templates/accounts/login.html:4
3207 msgid "Log In"
3208 msgstr "登入"
3209
3210 #: templates/accounts/login.html:10
3211 msgid "Log in to Review Board"
3212 msgstr "登陆Review Board"
3213
3214 #: templates/accounts/login.html:40 templates/base/headerbar.html:32
3215 msgid "Log in"
3216 msgstr "登入"
```
## Compiling the mo Files
Compile the .po files into .mo files:
```sh
[root@helloreview reviewboard]# django-admin.py compilemessages
processing file djangojs.po in /usr/lib64/python2.7/site-packages/reviewboard/locale/en/LC_MESSAGES
processing file django.po in /usr/lib64/python2.7/site-packages/reviewboard/locale/en/LC_MESSAGES
processing file djangojs.po in /usr/lib64/python2.7/site-packages/reviewboard/locale/es/LC_MESSAGES
processing file django.po in /usr/lib64/python2.7/site-packages/reviewboard/locale/es/LC_MESSAGES
processing file django.po in /usr/lib64/python2.7/site-packages/reviewboard/locale/it_IT/LC_MESSAGES
processing file djangojs.po in /usr/lib64/python2.7/site-packages/reviewboard/locale/it_IT/LC_MESSAGES
processing file django.po in /usr/lib64/python2.7/site-packages/reviewboard/locale/ko_KR/LC_MESSAGES
processing file djangojs.po in /usr/lib64/python2.7/site-packages/reviewboard/locale/ko_KR/LC_MESSAGES
processing file django.po in /usr/lib64/python2.7/site-packages/reviewboard/locale/pt_BR/LC_MESSAGES
processing file djangojs.po in /usr/lib64/python2.7/site-packages/reviewboard/locale/pt_BR/LC_MESSAGES
processing file djangojs.po in /usr/lib64/python2.7/site-packages/reviewboard/locale/zh_TW/LC_MESSAGES
processing file django.po in /usr/lib64/python2.7/site-packages/reviewboard/locale/zh_TW/LC_MESSAGES
processing file djangojs.po in /usr/lib64/python2.7/site-packages/reviewboard/locale/zh_CN/LC_MESSAGES
processing file django.po in /usr/lib64/python2.7/site-packages/reviewboard/locale/zh_CN/LC_MESSAGES
```
Use the alias commands to regenerate the .mo files and restart Apache:
```sh
[root@helloreview reviewboard]# rcc && rhttpd
/usr/lib64/python2.7/site-packages/reviewboard /usr/lib64/python2.7/site-packages/reviewboard
processing file djangojs.po in /usr/lib64/python2.7/site-packages/reviewboard/locale/en/LC_MESSAGES
processing file django.po in /usr/lib64/python2.7/site-packages/reviewboard/locale/en/LC_MESSAGES
processing file djangojs.po in /usr/lib64/python2.7/site-packages/reviewboard/locale/es/LC_MESSAGES
processing file django.po in /usr/lib64/python2.7/site-packages/reviewboard/locale/es/LC_MESSAGES
processing file django.po in /usr/lib64/python2.7/site-packages/reviewboard/locale/it_IT/LC_MESSAGES
processing file djangojs.po in /usr/lib64/python2.7/site-packages/reviewboard/locale/it_IT/LC_MESSAGES
processing file django.po in /usr/lib64/python2.7/site-packages/reviewboard/locale/ko_KR/LC_MESSAGES
processing file djangojs.po in /usr/lib64/python2.7/site-packages/reviewboard/locale/ko_KR/LC_MESSAGES
processing file django.po in /usr/lib64/python2.7/site-packages/reviewboard/locale/pt_BR/LC_MESSAGES
processing file djangojs.po in /usr/lib64/python2.7/site-packages/reviewboard/locale/pt_BR/LC_MESSAGES
processing file djangojs.po in /usr/lib64/python2.7/site-packages/reviewboard/locale/zh_TW/LC_MESSAGES
processing file django.po in /usr/lib64/python2.7/site-packages/reviewboard/locale/zh_TW/LC_MESSAGES
processing file djangojs.po in /usr/lib64/python2.7/site-packages/reviewboard/locale/zh_CN/LC_MESSAGES
processing file django.po in /usr/lib64/python2.7/site-packages/reviewboard/locale/zh_CN/LC_MESSAGES
/usr/lib64/python2.7/site-packages/reviewboard
```
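What `compilemessages` produces is a binary GNU `.mo` catalog that `gettext` loads at runtime. The sketch below packs a one-entry catalog into the `.mo` format in memory, following the layout used by CPython's `msgfmt` tool (this is an illustration, not Django's actual compiler), and reads it back:

```python
import gettext
import io
import struct
from array import array

def make_mo(catalog):
    """Pack a {msgid: msgstr} dict into GNU .mo bytes (msgfmt's layout)."""
    keys = sorted(catalog)                 # "" (metadata entry) sorts first
    ids = strs = b""
    offsets = []
    for key in keys:
        msg, tmsg = key.encode("utf-8"), catalog[key].encode("utf-8")
        offsets.append((len(ids), len(msg), len(strs), len(tmsg)))
        ids += msg + b"\x00"
        strs += tmsg + b"\x00"
    keystart = 7 * 4 + 16 * len(keys)      # header + both offset tables
    valuestart = keystart + len(ids)
    koffsets, voffsets = [], []
    for o1, l1, o2, l2 in offsets:
        koffsets += [l1, o1 + keystart]
        voffsets += [l2, o2 + valuestart]
    out = struct.pack("Iiiiiii", 0x950412DE, 0, len(keys),
                      7 * 4, 7 * 4 + len(keys) * 8, 0, 0)
    out += array("i", koffsets + voffsets).tobytes()
    return out + ids + strs

mo = make_mo({
    "": "Content-Type: text/plain; charset=UTF-8\n",  # catalog metadata
    "Log in to Review Board": "登陆Review Board",
})
t = gettext.GNUTranslations(io.BytesIO(mo))
print(t.gettext("Log in to Review Board"))  # 登陆Review Board
```

A string missing from the catalog falls back to itself, which is why untranslated parts of the interface stay in English.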
Refresh the ReviewBoard site again and the Chinese translation appears:

## Localization Notes
- `ReviewBoard 3.0.12` runs on `python2.7`; do not install ReviewBoard with Python 3.
- When using grep to find the source file for an English string, redirect the results into a file in the parent directory of the reviewboard app, so that large result sets do not flood the terminal.
References:
- [Language identifiers](http://www.i18nguy.com/unicode/language-identifiers.html)
---
title: Office 365 configuration to use Azure RMS from AIP
description: Information and instructions for administrators to configure Office 365 to use the Azure Rights Management service from Azure Information Protection.
author: cabailey
ms.author: cabailey
manager: barbkess
ms.date: 04/23/2019
ms.topic: conceptual
ms.collection: M365-security-compliance
ms.service: information-protection
ms.assetid: 0a6ce612-1b6b-4e21-b7fd-bcf79e492c3b
ms.reviewer: esaggese
ms.suite: ems
ms.openlocfilehash: c621ce93810243ce022433d860671a34e7d679c0
ms.sourcegitcommit: f9077101a974459a4252e763b5fafe51ff15a16f
ms.translationtype: MT
ms.contentlocale: ja-JP
ms.lasthandoff: 04/28/2019
ms.locfileid: "64767969"
---
# <a name="office365-configuration-for-clients-and-online-services-to-use-the-azure-rights-management-service"></a>Office 365: Configuration for clients and online services to use the Azure Rights Management service
>*Applies to: [Azure Information Protection](https://azure.microsoft.com/pricing/details/information-protection), [Office 365](https://download.microsoft.com/download/E/C/F/ECF42E71-4EC0-48FF-AA00-577AC14D5B5C/Azure_Information_Protection_licensing_datasheet_EN-US.pdf)*
Office 365 natively supports the Azure Rights Management service from Azure Information Protection, so applications such as Word, Excel, PowerPoint, Outlook, and Outlook on the web support Information Rights Management (IRM) features without any client computer configuration. All that users need to do is sign in to their Office applications with their Rights Management credentials. After signing in, they can protect files and emails, and consume files and emails that other people have protected.
However, we recommend supplementing these applications with the Azure Information Protection client, so that users get an Office add-in and support for additional file types. For more information, see [Azure Information Protection client: Installation and configuration for clients](configure-client.md).
## <a name="exchangeonline-irm-configuration"></a>Exchange Online: IRM configuration
To understand how Exchange Online IRM works with the Azure Rights Management service, see the [Exchange Online and Exchange Server](office-apps-services-support.md#exchange-online-and-exchange-server) section of [How Office applications and services support Azure Rights Management](office-apps-services-support.md).
Exchange Online might already be enabled to use the Azure Rights Management service. To check, run the following commands:
1. If this is the first time that you have used Windows PowerShell for Exchange Online on your computer, you must configure Windows PowerShell to run signed scripts. Start your Windows PowerShell session by using the **Run as administrator** option, and then type:
Set-ExecutionPolicy RemoteSigned
Press **Y** to confirm.
2. In your Windows PowerShell session, sign in to Exchange Online by using an account that is enabled for remote Shell access. By default, all accounts created in Exchange Online are enabled for remote Shell access, but it can be disabled (or enabled) with the [Set-User <UserIdentity> -RemotePowerShellEnabled](https://technet.microsoft.com/library/jj984292%28v=exchg.160%29.aspx) command.
To sign in, first type:
$Cred = Get-Credential
Then, in the **Windows PowerShell Credential Request** dialog box, enter your Office 365 user name and password.
3. Connect to the Exchange Online service, first by setting a variable:
$Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://ps.outlook.com/powershell/ -Credential $Cred -Authentication Basic –AllowRedirection
Then run:
Import-PSSession $Session
4. Run the [Get-IRMConfiguration](https://technet.microsoft.com/library/dd776120(v=exchg.160).aspx) command to view your Exchange Online configuration for the protection service:
Get-IRMConfiguration
In the output, look for the value of **AzureRMSLicensingEnabled**:
- If AzureRMSLicensingEnabled is set to **True**, Exchange Online is already enabled for the Azure Rights Management service.
- If AzureRMSLicensingEnabled is set to **False**, run the command `Set-IRMConfiguration -AzureRMSLicensingEnabled $true` to enable Exchange Online for the Azure Rights Management service.
5. To test that Exchange Online is configured correctly, run the following command:
```
Test-IRMConfiguration -Sender <user email address>
```
For example: <strong>Test-IRMConfiguration -Sender [email protected]</strong>
This command runs a series of checks, which include verifying connectivity to the service and retrieving the configuration, URIs, licenses, and any templates. If all the checks pass, the Windows PowerShell session displays the result of each one and ends with: **OVERALL RESULT: PASS**
When Exchange Online is enabled to use the Azure Rights Management service, you can configure features that apply information protection automatically, such as [mail flow rules](https://support.office.com/article/define-mail-flow-rules-to-encrypt-email-messages-in-office-365-9b7daf19-d5f2-415b-bc43-a0f5f4a585e8), [data loss prevention (DLP) policies](https://technet.microsoft.com/library/jj150527%28v=exchg.150%29.aspx), and [protected voice mail](https://technet.microsoft.com/library/dn198211%28v=exchg.150%29.aspx) (unified messaging).
## <a name="sharepointonline-and-onedrive-for-business-irm-configuration"></a>SharePoint Online and OneDrive for Business: IRM configuration
To understand how SharePoint Online IRM works with the Azure Rights Management service, see the [SharePoint Online and SharePoint Server](office-apps-services-support.md#sharepoint-online-and-sharepoint-server) section of this documentation.
To configure SharePoint Online and OneDrive for Business to support the Azure Rights Management service, you must first enable the Information Rights Management (IRM) service for SharePoint Online by using the SharePoint admin center. Site owners can then IRM-protect their SharePoint lists and document libraries, and users can IRM-protect their OneDrive for Business libraries, so that documents saved to these locations, and shared from them, are automatically protected by the Azure Rights Management service.
> [!NOTE]
> IRM-protected libraries for SharePoint and OneDrive for Business require the latest version of the new OneDrive sync client (OneDrive.exe) and this version of the [RMS client from the Microsoft Download Center](https://www.microsoft.com/en-us/download/details.aspx?id=38396). Install this version of the RMS client even if you have installed the Azure Information Protection client. For more information about this deployment scenario, see [Deploy the new OneDrive sync client in an enterprise environment](https://support.office.com/article/Deploy-the-new-OneDrive-sync-client-in-an-enterprise-environment-3f3a511c-30c6-404a-98bf-76f95c519668).
To enable the Information Rights Management (IRM) service for SharePoint Online, see the following instructions in the Office documentation:
- [Set up Information Rights Management (IRM) in the SharePoint admin center](/office365/securitycompliance/set-up-irm-in-sp-admin-center)
This configuration is done by the Office 365 administrator.
### <a name="configuring-irm-for-libraries-and-lists"></a>Configuring IRM for libraries and lists
After you enable the IRM service for SharePoint, site owners can IRM-protect their SharePoint document libraries and lists. For instructions, see the following on the Office website:
- [Apply Information Rights Management to a list or library](https://office.microsoft.com/sharepoint-help/apply-information-rights-management-to-a-list-or-library-HA102891460.aspx)
This configuration is done by the SharePoint site administrator.
### <a name="configuring-irm-for-onedrive-for-business"></a>Configuring IRM for OneDrive for Business
After you enable the IRM service for SharePoint Online, users' OneDrive for Business document libraries or individual folders can be configured for Rights Management protection. Users can configure this for themselves from their OneDrive website. Administrators cannot configure this protection for them by using the SharePoint admin center, but they can do so by using Windows PowerShell.
> [!NOTE]
> For more information about configuring OneDrive for Business, see [Set up OneDrive for Business in Office 365](https://support.office.com/article/Set-up-OneDrive-for-Business-in-Office-365-3e21f8f0-e0a1-43be-aa3e-8c0236bf11bb) in the Office documentation.
#### <a name="configuration-for-users"></a>Configuration for users
So that users can configure their OneDrive for Business to protect their business files, give them the following instructions:
1. Sign in to Office 365 with your work or school account, and go to the [OneDrive website](https://admin.microsoft.com/onedrive).
2. At the bottom of the navigation pane, select **Return to classic OneDrive**.
3. Select the **Settings** icon. In the **Settings** pane, if **Ribbon** is set to **Off**, select this setting to turn the ribbon on.
4. To configure all files in your OneDrive for Business to be protected, select the **LIBRARY** tab on the ribbon, and then select **Library Settings**.
5. On the **Documents > Settings** page, in the **Permissions and Management** section, select **Information Rights Management**.
6. On the **Information Rights Management Settings** page, select the **Restrict permissions on this library on download** check box. Give the permissions a name, specify a description, optionally click **SHOW OPTIONS** to configure the options, and then click **OK**.
For more information about the configuration options, see the instructions in [Apply Information Rights Management to a list or library](https://support.office.com/article/Apply-Information-Rights-Management-to-a-list-or-library-3bdb5c4e-94fc-4741-b02f-4e7cc3c54aa1) in the Office documentation.
Because this configuration must be done by users to IRM-protect their own OneDrive for Business files rather than by an administrator, educate users about the benefits of protecting their files and how to do it. For example, explain that when they share a document from OneDrive for Business, only the people they authorize can access it, with any restrictions that they configured, even if the file is renamed and copied somewhere else.
#### <a name="configuration-for-administrators"></a>Configuration for administrators
Although administrators cannot configure IRM for users' OneDrive for Business by using the SharePoint admin center, they can do so by using Windows PowerShell. To enable IRM for these libraries, follow these steps:
1. Download and install the [SharePoint Online Client Components SDK](https://www.microsoft.com/en-us/download/details.aspx?id=42038).
2. Download and install the [SharePoint Online Management Shell](https://www.microsoft.com/en-us/download/details.aspx?id=35588).
3. Copy the contents of the following script and save them to a file named Set-IRMOnOneDriveForBusiness.ps1 on your computer.
***Disclaimer***: This sample script is not supported under any Microsoft standard support program or service. The sample script is provided as is, without warranty of any kind.
```
# Requires Windows PowerShell version 3
<#
Description:
Configures IRM policy settings for OneDrive for Business and can also be used for SharePoint Online libraries and lists
Script Installation Requirements:
SharePoint Online Client Components SDK
https://www.microsoft.com/en-us/download/details.aspx?id=42038
SharePoint Online Management Shell
https://www.microsoft.com/en-us/download/details.aspx?id=35588
======
#>
# URL will be in the format https://<tenant-name>-admin.sharepoint.com
$sharepointAdminCenterUrl = "https://contoso-admin.sharepoint.com"
$tenantAdmin = "[email protected]"
$webUrls = @("https://contoso-my.sharepoint.com/personal/user1_contoso_com",
"https://contoso-my.sharepoint.com/personal/user2_contoso_com",
"https://contoso-my.sharepoint.com/personal/user3_contoso_com")
<# As an alternative to specifying the URLs as an array, you can import them from a CSV file (no header, single value per row).
Then, use: $webUrls = Get-Content -Path "File_path_and_name.csv"
#>
$listTitle = "Documents"
function Load-SharePointOnlineClientComponentAssemblies
{
[cmdletbinding()]
param()
process
{
# assembly location: C:\Program Files\Common Files\microsoft shared\Web Server Extensions\16\ISAPI
try
{
Write-Verbose "Loading Assembly: Microsoft.Office.Client.Policy, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c"
[System.Reflection.Assembly]::Load("Microsoft.Office.Client.Policy, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c") | Out-Null
Write-Verbose "Loading Assembly: Microsoft.Office.Client.TranslationServices, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c"
[System.Reflection.Assembly]::Load("Microsoft.Office.Client.TranslationServices, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c") | Out-Null
Write-Verbose "Loading Assembly: Microsoft.SharePoint.Client, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c"
[System.Reflection.Assembly]::Load("Microsoft.SharePoint.Client, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c") | Out-Null
Write-Verbose "Loading Assembly: Microsoft.SharePoint.Client.DocumentManagement, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c"
[System.Reflection.Assembly]::Load("Microsoft.SharePoint.Client.DocumentManagement, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c") | Out-Null
Write-Verbose "Loading Assembly: Microsoft.SharePoint.Client.Publishing, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c"
[System.Reflection.Assembly]::Load("Microsoft.SharePoint.Client.Publishing, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c") | Out-Null
Write-Verbose "Loading Assembly: Microsoft.SharePoint.Client.Runtime, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c"
[System.Reflection.Assembly]::Load("Microsoft.SharePoint.Client.Runtime, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c") | Out-Null
Write-Verbose "Loading Assembly: Microsoft.SharePoint.Client.Search.Applications, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c"
[System.Reflection.Assembly]::Load("Microsoft.SharePoint.Client.Search.Applications, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c") | Out-Null
Write-Verbose "Loading Assembly: Microsoft.SharePoint.Client.Search, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c"
[System.Reflection.Assembly]::Load("Microsoft.SharePoint.Client.Search, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c") | Out-Null
Write-Verbose "Loading Assembly: Microsoft.SharePoint.Client.Taxonomy, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c"
[System.Reflection.Assembly]::Load("Microsoft.SharePoint.Client.Taxonomy, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c") | Out-Null
Write-Verbose "Loading Assembly: Microsoft.SharePoint.Client.UserProfiles, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c"
[System.Reflection.Assembly]::Load("Microsoft.SharePoint.Client.UserProfiles, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c") | Out-Null
return $true
}
catch
{
if($_.Exception.Message -match "Could not load file or assembly")
{
Write-Error -Message "Unable to load the SharePoint Server 2013 Client Components.`nDownload Location: https://www.microsoft.com/en-us/download/details.aspx?id=42038"
}
else
{
Write-Error -Exception $_.Exception
}
return $false
}
}
}
function Load-SharePointOnlineModule
{
[cmdletbinding()]
param()
process
{
do
{
# Installation location: C:\Program Files\SharePoint Online Management Shell\Microsoft.Online.SharePoint.PowerShell
$spoModule = Get-Module -Name Microsoft.Online.SharePoint.PowerShell -ErrorAction SilentlyContinue
if(-not $spoModule)
{
try
{
Import-Module Microsoft.Online.SharePoint.PowerShell -DisableNameChecking
return $true
}
catch
{
if($_.Exception.Message -match "Could not load file or assembly")
{
Write-Error -Message "Unable to load the SharePoint Online Management Shell.`nDownload Location: https://www.microsoft.com/en-us/download/details.aspx?id=35588"
}
else
{
Write-Error -Exception $_.Exception
}
return $false
}
}
else
{
return $true
}
}
while(-not $spoModule)
}
}
function Set-IrmConfiguration
{
[cmdletbinding()]
param(
[parameter(Mandatory=$true)][Microsoft.SharePoint.Client.List]$List,
[parameter(Mandatory=$true)][string]$PolicyTitle,
[parameter(Mandatory=$true)][string]$PolicyDescription,
[parameter(Mandatory=$false)][switch]$IrmReject,
[parameter(Mandatory=$false)][DateTime]$ProtectionExpirationDate,
[parameter(Mandatory=$false)][switch]$DisableDocumentBrowserView,
[parameter(Mandatory=$false)][switch]$AllowPrint,
[parameter(Mandatory=$false)][switch]$AllowScript,
[parameter(Mandatory=$false)][switch]$AllowWriteCopy,
[parameter(Mandatory=$false)][int]$DocumentAccessExpireDays,
[parameter(Mandatory=$false)][int]$LicenseCacheExpireDays,
[parameter(Mandatory=$false)][string]$GroupName
)
process
{
Write-Verbose "Applying IRM Configuration on '$($List.Title)'"
# reset the value to the default settings
$list.InformationRightsManagementSettings.Reset()
$list.IrmEnabled = $true
# IRM Policy title and description
$list.InformationRightsManagementSettings.PolicyTitle = $PolicyTitle
$list.InformationRightsManagementSettings.PolicyDescription = $PolicyDescription
# Set additional IRM library settings
# Do not allow users to upload documents that do not support IRM
$list.IrmReject = $IrmReject.IsPresent
# Only set an expiration when a date was actually supplied; a typed
# [DateTime] parameter always parses successfully (it defaults to MinValue)
if($PSBoundParameters.ContainsKey('ProtectionExpirationDate'))
{
# Stop restricting access to the library at <date>
$list.IrmExpire = $true
$list.InformationRightsManagementSettings.DocumentLibraryProtectionExpireDate = $ProtectionExpirationDate
}
# Prevent opening documents in the browser for this Document Library
$list.InformationRightsManagementSettings.DisableDocumentBrowserView = $DisableDocumentBrowserView.IsPresent
# Configure document access rights
# Allow viewers to print
$list.InformationRightsManagementSettings.AllowPrint = $AllowPrint.IsPresent
# Allow viewers to run script and screen reader to function on downloaded documents
$list.InformationRightsManagementSettings.AllowScript = $AllowScript.IsPresent
# Allow viewers to write on a copy of the downloaded document
$list.InformationRightsManagementSettings.AllowWriteCopy = $AllowWriteCopy.IsPresent
if($DocumentAccessExpireDays)
{
# After download, document access rights will expire after these number of days (1-365)
$list.InformationRightsManagementSettings.EnableDocumentAccessExpire = $true
$list.InformationRightsManagementSettings.DocumentAccessExpireDays = $DocumentAccessExpireDays
}
# Set group protection and credentials interval
if($LicenseCacheExpireDays)
{
# Users must verify their credentials using this interval (days)
$list.InformationRightsManagementSettings.EnableLicenseCacheExpire = $true
$list.InformationRightsManagementSettings.LicenseCacheExpireDays = $LicenseCacheExpireDays
}
if($GroupName)
{
# Allow group protection. Default group:
$list.InformationRightsManagementSettings.EnableGroupProtection = $true
$list.InformationRightsManagementSettings.GroupName = $GroupName
}
}
end
{
if($list)
{
Write-Verbose "Committing IRM configuration settings on '$($list.Title)'"
$list.InformationRightsManagementSettings.Update()
$list.Update()
$script:clientContext.Load($list)
$script:clientContext.ExecuteQuery()
}
}
}
function Get-CredentialFromCredentialCache
{
[cmdletbinding()]
param([string]$CredentialName)
#if( Test-Path variable:\global:CredentialCache )
if( Get-Variable O365TenantAdminCredentialCache -Scope Global -ErrorAction SilentlyContinue )
{
if($global:O365TenantAdminCredentialCache.ContainsKey($CredentialName))
{
Write-Verbose "Credential Cache Hit: $CredentialName"
return $global:O365TenantAdminCredentialCache[$CredentialName]
}
}
Write-Verbose "Credential Cache Miss: $CredentialName"
return $null
}
function Add-CredentialToCredentialCache
{
[cmdletbinding()]
param([System.Management.Automation.PSCredential]$Credential)
if(-not (Get-Variable O365TenantAdminCredentialCache -Scope Global -ErrorAction SilentlyContinue))
{
Write-Verbose "Initializing the Credential Cache"
$global:O365TenantAdminCredentialCache = @{}
}
Write-Verbose "Adding Credential to the Credential Cache"
$global:O365TenantAdminCredentialCache[$Credential.UserName] = $Credential
}
# load the required assemblies and Windows PowerShell modules
if(-not ((Load-SharePointOnlineClientComponentAssemblies) -and (Load-SharePointOnlineModule)) ) { return }
# Add the credentials to the client context and SharePoint Online service connection
# check for cached credentials to use
$o365TenantAdminCredential = Get-CredentialFromCredentialCache -CredentialName $tenantAdmin
if(-not $o365TenantAdminCredential)
{
# when credentials are not cached, prompt for the tenant admin credentials
$o365TenantAdminCredential = Get-Credential -UserName $tenantAdmin -Message "Enter the password for the Office 365 admin"
if(-not $o365TenantAdminCredential -or -not $o365TenantAdminCredential.UserName -or $o365TenantAdminCredential.Password.Length -eq 0 )
{
Write-Error -Message "Could not validate the supplied tenant admin credentials"
return
}
# add the credentials to the cache
Add-CredentialToCredentialCache -Credential $o365TenantAdminCredential
}
# connect to Office365 first, required for SharePoint Online cmdlets to run
Connect-SPOService -Url $sharepointAdminCenterUrl -Credential $o365TenantAdminCredential
# enumerate each of the specified site URLs
foreach($webUrl in $webUrls)
{
$grantedSiteCollectionAdmin = $false
try
{
# establish the client context and set the credentials to connect to the site
$script:clientContext = New-Object Microsoft.SharePoint.Client.ClientContext($webUrl)
$script:clientContext.Credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($o365TenantAdminCredential.UserName, $o365TenantAdminCredential.Password)
# initialize the site and web context
$script:clientContext.Load($script:clientContext.Site)
$script:clientContext.Load($script:clientContext.Web)
$script:clientContext.ExecuteQuery()
# load and ensure the tenant admin user account if present on the target SharePoint site
$tenantAdminUser = $script:clientContext.Web.EnsureUser($o365TenantAdminCredential.UserName)
$script:clientContext.Load($tenantAdminUser)
$script:clientContext.ExecuteQuery()
# check if the tenant admin is a site admin
if( -not $tenantAdminUser.IsSiteAdmin )
{
try
{
# grant the tenant admin temporary admin rights to the site collection
Set-SPOUser -Site $script:clientContext.Site.Url -LoginName $o365TenantAdminCredential.UserName -IsSiteCollectionAdmin $true | Out-Null
$grantedSiteCollectionAdmin = $true
}
catch
{
Write-Error $_.Exception
return
}
}
try
{
# load the list or library using CSOM
$list = $null
$list = $script:clientContext.Web.Lists.GetByTitle($listTitle)
$script:clientContext.Load($list)
$script:clientContext.ExecuteQuery()
# ************** ADMIN INSTRUCTIONS **************
# If necessary, modify the following Set-IrmConfiguration parameters to match your required values
# The supplied options and values are for example only
# Example that shows the Set-IrmConfiguration command with all parameters: Set-IrmConfiguration -List $list -PolicyTitle "Protected Files" -PolicyDescription "This policy restricts access to authorized users" -IrmReject -ProtectionExpirationDate $(Get-Date).AddDays(180) -DisableDocumentBrowserView -AllowPrint -AllowScript -AllowWriteCopy -LicenseCacheExpireDays 25 -DocumentAccessExpireDays 90
Set-IrmConfiguration -List $list -PolicyTitle "Protected Files" -PolicyDescription "This policy restricts access to authorized users"
}
catch
{
Write-Error -Message "Error setting IRM configuration on site: $webUrl.`nError Details: $($_.Exception.ToString())"
}
}
finally
{
if($grantedSiteCollectionAdmin)
{
# remove the temporary admin rights to the site collection
Set-SPOUser -Site $script:clientContext.Site.Url -LoginName $o365TenantAdminCredential.UserName -IsSiteCollectionAdmin $false | Out-Null
}
}
}
Disconnect-SPOService -ErrorAction SilentlyContinue
```
4. Review the script and make the following changes:
1. Search for `$sharepointAdminCenterUrl` and replace the example value with your own SharePoint admin center URL.
You can find this value as the base URL when you go to the SharePoint admin center; it has the format https://<em>tenant_name</em>-admin.sharepoint.com.
For example, if the tenant name is "contoso", you would specify **https://contoso-admin.sharepoint.com**.
2. Search for `$tenantAdmin` and replace the example value with your own fully qualified global administrator account for Office 365.
This value is the same account you use to sign in to the Microsoft 365 admin center as the global administrator; it has the format user_name@*tenant_domain_name*.com.
For example, if the Office 365 global administrator user name is "admin" for the "contoso.com" tenant domain, you would specify <strong>admin@contoso.com</strong>.
3. Search for `$webUrls` and replace the example values with your users' OneDrive for Business web URLs, adding or deleting entries as necessary.
Alternatively, see the comments in the script about how to replace this array by importing a .CSV file that contains all the URLs you need to configure. We provide another sample script that automatically searches for and extracts the URLs to create this .CSV file for you. When you are ready to do this, follow the instructions in the [Additional script to output all OneDrive for Business URLs to a .CSV file](#additional-script-to-output-all-onedrive-for-business-urls-to-a-csv-file) section immediately after completing these steps.
The OneDrive for Business web URL for a user has the format https://<em>tenant_name</em>-my.sharepoint.com/personal/*user_name*_*tenant_name*_com.
For example, for a user in the contoso tenant whose user name is "rsimone", you would specify **https://contoso-my.sharepoint.com/personal/rsimone_contoso_com**.
4. Because you are using the script to configure OneDrive for Business, do not change the value of **Documents** for the `$listTitle` variable.
5. Search for `ADMIN INSTRUCTIONS`. If you do not change this section, users' OneDrive for Business will be configured for IRM with the policy title "Protected Files" and the description "This policy restricts access to authorized users". No other IRM options will be set, which is probably appropriate for most environments. However, you can change the suggested policy title and description, and you can also add other IRM options to suit your environment. See the commented example in the script to help you build your own parameter set for the Set-IrmConfiguration command.
5. Save the script and sign it. If you do not sign the script (signing it is more secure), you must configure Windows PowerShell on your computer to run unsigned scripts. To do this, run a Windows PowerShell session with the **Run as Administrator** option and type **Set-ExecutionPolicy Unrestricted**. However, note that this configuration lets any unsigned script run (less secure).
For more information about signing Windows PowerShell scripts, see [about_Signing](https://technet.microsoft.com/library/hh847874.aspx) in the PowerShell documentation library.
6. Run the script and, if prompted, enter the password for your Office 365 admin account. If you modify the script and run it within the same Windows PowerShell session, you are not prompted for credentials.
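The steps above suggest **Set-ExecutionPolicy Unrestricted** as the simplest way to run the unsigned script. A less permissive alternative, sketched below, is the RemoteSigned policy, which lets locally authored scripts such as this one run while still requiring downloaded scripts to carry a signature:

```powershell
# Run from an elevated (Run as Administrator) Windows PowerShell session.
# RemoteSigned lets locally created scripts run unsigned, while scripts
# downloaded from the Internet must still have a valid signature.
Set-ExecutionPolicy -ExecutionPolicy RemoteSigned

# Alternatively, limit the change to the current user, which does not
# require an elevated session:
Set-ExecutionPolicy -ExecutionPolicy RemoteSigned -Scope CurrentUser
```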
> [!TIP]
> You can also use this script to configure IRM for a SharePoint Online library. For that configuration, you will probably want to enable the additional option **Do not allow users to upload documents that do not support IRM**, to ensure that the library contains only protected documents. To do so, add the `-IrmReject` parameter to the Set-IrmConfiguration command in the script.
>
> You will also need to change the `$webUrls` variable (for example, **https://contoso.sharepoint.com**) and the `$listTitle` variable (for example, **Reports**).
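As an illustration of the tip above, the edits for protecting a single SharePoint Online library might look like the following sketch (the site URL, library title, and policy text are example values only):

```powershell
# Example values only - substitute your own site URL and library title.
$webUrls = @("https://contoso.sharepoint.com")
$listTitle = "Reports"

# Later in the script, add -IrmReject so that only documents that
# support IRM can be uploaded to the library.
Set-IrmConfiguration -List $list -PolicyTitle "Protected Files" `
    -PolicyDescription "This policy restricts access to authorized users" `
    -IrmReject
```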
If you need to disable IRM for users' OneDrive for Business libraries, see the [Script to disable IRM for OneDrive for Business](#script-to-disable-irm-for-onedrive-for-business) section.
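After the configuration script completes, you can confirm the change by reading the IRM settings back through CSOM. The following sketch assumes the same connected client context as in the main script:

```powershell
# Sketch: read back the IRM settings of one library to verify the change.
# Assumes $script:clientContext is already connected as in the main script.
$list = $script:clientContext.Web.Lists.GetByTitle("Documents")
$script:clientContext.Load($list)
$script:clientContext.Load($list.InformationRightsManagementSettings)
$script:clientContext.ExecuteQuery()
Write-Output "IrmEnabled: $($list.IrmEnabled)"
Write-Output "PolicyTitle: $($list.InformationRightsManagementSettings.PolicyTitle)"
```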
##### <a name="additional-script-to-output-all-onedrive-for-business-urls-to-a-csv-file"></a>Additional script to output all OneDrive for Business URLs to a .CSV file
For step 4c in the preceding procedure, you can use the following Windows PowerShell script to extract the URLs of all your users' OneDrive for Business libraries, which you can then check, edit if necessary, and import into the main script.
This script also requires the [SharePoint Online Client Components SDK](https://www.microsoft.com/en-us/download/details.aspx?id=42038) and the [SharePoint Online Management Shell](https://www.microsoft.com/en-us/download/details.aspx?id=35588). Using the same instructions as before, copy and paste the contents, save the file locally (for example, "Report-OneDriveForBusinessSiteInfo.ps1"), modify the `$sharepointAdminCenterUrl` and `$tenantAdmin` values, and then run the script.
***Disclaimer***: This sample script is not supported under any Microsoft standard support program or service. This sample script is provided as-is without warranty of any kind.
```powershell
# Requires Windows PowerShell version 3
<#
Description:
Queries the search service of an Office 365 tenant to retrieve all OneDrive for Business sites.
Details of the discovered sites are written to a .CSV file (by default,"OneDriveForBusinessSiteInfo_<date>.csv").
Script Installation Requirements:
SharePoint Online Client Components SDK
https://www.microsoft.com/en-us/download/details.aspx?id=42038
SharePoint Online Management Shell
https://www.microsoft.com/en-us/download/details.aspx?id=35588
======
#>
# URL will be in the format https://<tenant-name>-admin.sharepoint.com
$sharepointAdminCenterUrl = "https://contoso-admin.sharepoint.com"
$tenantAdmin = "admin@contoso.com"
$reportName = "OneDriveForBusinessSiteInfo_$((Get-Date).ToString("yyyy-MM-dd_hh.mm.ss")).csv"
$oneDriveForBusinessSiteUrls= @()
$resultsProcessed = 0
function Load-SharePointOnlineClientComponentAssemblies
{
[cmdletbinding()]
param()
process
{
# assembly location: C:\Program Files\Common Files\microsoft shared\Web Server Extensions\16\ISAPI
try
{
Write-Verbose "Loading Assembly: Microsoft.Office.Client.Policy, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c"
[System.Reflection.Assembly]::Load("Microsoft.Office.Client.Policy, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c") | Out-Null
Write-Verbose "Loading Assembly: Microsoft.Office.Client.TranslationServices, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c"
[System.Reflection.Assembly]::Load("Microsoft.Office.Client.TranslationServices, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c") | Out-Null
Write-Verbose "Loading Assembly: Microsoft.SharePoint.Client, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c"
[System.Reflection.Assembly]::Load("Microsoft.SharePoint.Client, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c") | Out-Null
Write-Verbose "Loading Assembly: Microsoft.SharePoint.Client.DocumentManagement, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c"
[System.Reflection.Assembly]::Load("Microsoft.SharePoint.Client.DocumentManagement, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c") | Out-Null
Write-Verbose "Loading Assembly: Microsoft.SharePoint.Client.Publishing, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c"
[System.Reflection.Assembly]::Load("Microsoft.SharePoint.Client.Publishing, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c") | Out-Null
Write-Verbose "Loading Assembly: Microsoft.SharePoint.Client.Runtime, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c"
[System.Reflection.Assembly]::Load("Microsoft.SharePoint.Client.Runtime, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c") | Out-Null
Write-Verbose "Loading Assembly: Microsoft.SharePoint.Client.Search.Applications, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c"
[System.Reflection.Assembly]::Load("Microsoft.SharePoint.Client.Search.Applications, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c") | Out-Null
Write-Verbose "Loading Assembly: Microsoft.SharePoint.Client.Search, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c"
[System.Reflection.Assembly]::Load("Microsoft.SharePoint.Client.Search, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c") | Out-Null
Write-Verbose "Loading Assembly: Microsoft.SharePoint.Client.Taxonomy, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c"
[System.Reflection.Assembly]::Load("Microsoft.SharePoint.Client.Taxonomy, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c") | Out-Null
Write-Verbose "Loading Assembly: Microsoft.SharePoint.Client.UserProfiles, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c"
[System.Reflection.Assembly]::Load("Microsoft.SharePoint.Client.UserProfiles, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c") | Out-Null
return $true
}
catch
{
if($_.Exception.Message -match "Could not load file or assembly")
{
Write-Error -Message "Unable to load the SharePoint Server 2013 Client Components.`nDownload Location: https://www.microsoft.com/en-us/download/details.aspx?id=42038"
}
else
{
Write-Error -Exception $_.Exception
}
return $false
}
}
}
function Load-SharePointOnlineModule
{
[cmdletbinding()]
param()
process
{
do
{
# Installation location: C:\Program Files\SharePoint Online Management Shell\Microsoft.Online.SharePoint.PowerShell
$spoModule = Get-Module -Name Microsoft.Online.SharePoint.PowerShell -ErrorAction SilentlyContinue
if(-not $spoModule)
{
try
{
Import-Module Microsoft.Online.SharePoint.PowerShell -DisableNameChecking
return $true
}
catch
{
if($_.Exception.Message -match "Could not load file or assembly")
{
Write-Error -Message "Unable to load the SharePoint Online Management Shell.`nDownload Location: https://www.microsoft.com/en-us/download/details.aspx?id=35588"
}
else
{
Write-Error -Exception $_.Exception
}
return $false
}
}
else
{
return $true
}
}
while(-not $spoModule)
}
}
function Get-CredentialFromCredentialCache
{
[cmdletbinding()]
param([string]$CredentialName)
#if( Test-Path variable:\global:CredentialCache )
if( Get-Variable O365TenantAdminCredentialCache -Scope Global -ErrorAction SilentlyContinue )
{
if($global:O365TenantAdminCredentialCache.ContainsKey($CredentialName))
{
Write-Verbose "Credential Cache Hit: $CredentialName"
return $global:O365TenantAdminCredentialCache[$CredentialName]
}
}
Write-Verbose "Credential Cache Miss: $CredentialName"
return $null
}
function Add-CredentialToCredentialCache
{
[cmdletbinding()]
param([System.Management.Automation.PSCredential]$Credential)
if(-not (Get-Variable O365TenantAdminCredentialCache -Scope Global -ErrorAction SilentlyContinue))
{
Write-Verbose "Initializing the Credential Cache"
$global:O365TenantAdminCredentialCache = @{}
}
Write-Verbose "Adding Credential to the Credential Cache"
$global:O365TenantAdminCredentialCache[$Credential.UserName] = $Credential
}
# load the required assemblies and Windows PowerShell modules
if(-not ((Load-SharePointOnlineClientComponentAssemblies) -and (Load-SharePointOnlineModule)) ) { return }
# Add the credentials to the client context and SharePoint Online service connection
# check for cached credentials to use
$o365TenantAdminCredential = Get-CredentialFromCredentialCache -CredentialName $tenantAdmin
if(-not $o365TenantAdminCredential)
{
# when credentials are not cached, prompt for the tenant admin credentials
$o365TenantAdminCredential = Get-Credential -UserName $tenantAdmin -Message "Enter the password for the Office 365 admin"
if(-not $o365TenantAdminCredential -or -not $o365TenantAdminCredential.UserName -or $o365TenantAdminCredential.Password.Length -eq 0 )
{
Write-Error -Message "Could not validate the supplied tenant admin credentials"
return
}
# add the credentials to the cache
Add-CredentialToCredentialCache -Credential $o365TenantAdminCredential
}
# establish the client context and set the credentials to connect to the site
$clientContext = New-Object Microsoft.SharePoint.Client.ClientContext($sharepointAdminCenterUrl)
$clientContext.Credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($o365TenantAdminCredential.UserName, $o365TenantAdminCredential.Password)
# run a query against the Office 365 tenant search service to retrieve all OneDrive for Business URLs
do
{
# build the query object
$query = New-Object Microsoft.SharePoint.Client.Search.Query.KeywordQuery($clientContext)
$query.TrimDuplicates = $false
$query.RowLimit = 500
$query.QueryText = "SPSiteUrl:'/personal/' AND contentclass:STS_Site"
$query.StartRow = $resultsProcessed
$query.TotalRowsExactMinimum = 500000
# run the query
$searchExecutor = New-Object Microsoft.SharePoint.Client.Search.Query.SearchExecutor($clientContext)
$queryResults = $searchExecutor.ExecuteQuery($query)
$clientContext.ExecuteQuery()
# enumerate the search results and store the site URLs
$queryResults.Value[0].ResultRows | % {
$oneDriveForBusinessSiteUrls += $_.Path
$resultsProcessed++
}
}
while($resultsProcessed -lt $queryResults.Value.TotalRows)
$oneDriveForBusinessSiteUrls | Out-File -FilePath $reportName
```
##### <a name="script-to-disable-irm-for-onedrive-for-business"></a>Script to disable IRM for OneDrive for Business
Use the following sample script if you need to disable IRM for your users' OneDrive for Business.
This script also requires the [SharePoint Online Client Components SDK](https://www.microsoft.com/en-us/download/details.aspx?id=42038) and the [SharePoint Online Management Shell](https://www.microsoft.com/en-us/download/details.aspx?id=35588). Copy and paste the contents, save the file locally (for example, "Disable-IRMOnOneDriveForBusiness.ps1"), and modify the `$sharepointAdminCenterUrl` and `$tenantAdmin` values. Manually specify the OneDrive for Business URLs, or use the script from the previous section so that you can import them, and then run the script.
***Disclaimer***: This sample script is not supported under any Microsoft standard support program or service. This sample script is provided as-is without warranty of any kind.
```powershell
# Requires Windows PowerShell version 3
<#
Description:
Disables IRM for OneDrive for Business and can also be used for SharePoint Online libraries and lists
Script Installation Requirements:
SharePoint Online Client Components SDK
https://www.microsoft.com/en-us/download/details.aspx?id=42038
SharePoint Online Management Shell
https://www.microsoft.com/en-us/download/details.aspx?id=35588
======
#>
$sharepointAdminCenterUrl = "https://contoso-admin.sharepoint.com"
$tenantAdmin = "admin@contoso.com"
$webUrls = @("https://contoso-my.sharepoint.com/personal/user1_contoso_com",
"https://contoso-my.sharepoint.com/personal/user2_contoso_com",
"https://contoso-my.sharepoint.com/personal/person3_contoso_com")
<# As an alternative to specifying the URLs as an array, you can import them from a CSV file (no header, single value per row).
Then, use: $webUrls = Get-Content -Path "File_path_and_name.csv"
#>
$listTitle = "Documents"
function Load-SharePointOnlineClientComponentAssemblies
{
[cmdletbinding()]
param()
process
{
# assembly location: C:\Program Files\Common Files\microsoft shared\Web Server Extensions\16\ISAPI
try
{
Write-Verbose "Loading Assembly: Microsoft.Office.Client.Policy, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c"
[System.Reflection.Assembly]::Load("Microsoft.Office.Client.Policy, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c") | Out-Null
Write-Verbose "Loading Assembly: Microsoft.Office.Client.TranslationServices, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c"
[System.Reflection.Assembly]::Load("Microsoft.Office.Client.TranslationServices, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c") | Out-Null
Write-Verbose "Loading Assembly: Microsoft.SharePoint.Client, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c"
[System.Reflection.Assembly]::Load("Microsoft.SharePoint.Client, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c") | Out-Null
Write-Verbose "Loading Assembly: Microsoft.SharePoint.Client.DocumentManagement, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c"
[System.Reflection.Assembly]::Load("Microsoft.SharePoint.Client.DocumentManagement, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c") | Out-Null
Write-Verbose "Loading Assembly: Microsoft.SharePoint.Client.Publishing, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c"
[System.Reflection.Assembly]::Load("Microsoft.SharePoint.Client.Publishing, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c") | Out-Null
Write-Verbose "Loading Assembly: Microsoft.SharePoint.Client.Runtime, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c"
[System.Reflection.Assembly]::Load("Microsoft.SharePoint.Client.Runtime, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c") | Out-Null
Write-Verbose "Loading Assembly: Microsoft.SharePoint.Client.Search.Applications, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c"
[System.Reflection.Assembly]::Load("Microsoft.SharePoint.Client.Search.Applications, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c") | Out-Null
Write-Verbose "Loading Assembly: Microsoft.SharePoint.Client.Search, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c"
[System.Reflection.Assembly]::Load("Microsoft.SharePoint.Client.Search, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c") | Out-Null
Write-Verbose "Loading Assembly: Microsoft.SharePoint.Client.Taxonomy, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c"
[System.Reflection.Assembly]::Load("Microsoft.SharePoint.Client.Taxonomy, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c") | Out-Null
Write-Verbose "Loading Assembly: Microsoft.SharePoint.Client.UserProfiles, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c"
[System.Reflection.Assembly]::Load("Microsoft.SharePoint.Client.UserProfiles, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c") | Out-Null
return $true
}
catch
{
if($_.Exception.Message -match "Could not load file or assembly")
{
Write-Error -Message "Unable to load the SharePoint Server 2013 Client Components.`nDownload Location: https://www.microsoft.com/en-us/download/details.aspx?id=42038"
}
else
{
Write-Error -Exception $_.Exception
}
return $false
}
}
}
function Load-SharePointOnlineModule
{
[cmdletbinding()]
param()
process
{
do
{
# Installation location: C:\Program Files\SharePoint Online Management Shell\Microsoft.Online.SharePoint.PowerShell
$spoModule = Get-Module -Name Microsoft.Online.SharePoint.PowerShell -ErrorAction SilentlyContinue
if(-not $spoModule)
{
try
{
Import-Module Microsoft.Online.SharePoint.PowerShell -DisableNameChecking
return $true
}
catch
{
if($_.Exception.Message -match "Could not load file or assembly")
{
Write-Error -Message "Unable to load the SharePoint Online Management Shell.`nDownload Location: https://www.microsoft.com/en-us/download/details.aspx?id=35588"
}
else
{
Write-Error -Exception $_.Exception
}
return $false
}
}
else
{
return $true
}
}
while(-not $spoModule)
}
}
function Remove-IrmConfiguration
{
[cmdletbinding()]
param(
[parameter(Mandatory=$true)][Microsoft.SharePoint.Client.List]$List
)
process
{
Write-Verbose "Disabling IRM Configuration on '$($List.Title)'"
$List.IrmEnabled = $false
$List.IrmExpire = $false
$List.IrmReject = $false
$List.InformationRightsManagementSettings.Reset()
}
end
{
if($List)
{
Write-Verbose "Committing IRM configuration settings on '$($list.Title)'"
$list.InformationRightsManagementSettings.Update()
$list.Update()
$script:clientContext.Load($list)
$script:clientContext.ExecuteQuery()
}
}
}
function Get-CredentialFromCredentialCache
{
[cmdletbinding()]
param([string]$CredentialName)
#if( Test-Path variable:\global:CredentialCache )
if( Get-Variable O365TenantAdminCredentialCache -Scope Global -ErrorAction SilentlyContinue )
{
if($global:O365TenantAdminCredentialCache.ContainsKey($CredentialName))
{
Write-Verbose "Credential Cache Hit: $CredentialName"
return $global:O365TenantAdminCredentialCache[$CredentialName]
}
}
Write-Verbose "Credential Cache Miss: $CredentialName"
return $null
}
function Add-CredentialToCredentialCache
{
[cmdletbinding()]
param([System.Management.Automation.PSCredential]$Credential)
if(-not (Get-Variable O365TenantAdminCredentialCache -Scope Global -ErrorAction SilentlyContinue))
{
Write-Verbose "Initializing the Credential Cache"
$global:O365TenantAdminCredentialCache = @{}
}
Write-Verbose "Adding Credential to the Credential Cache"
$global:O365TenantAdminCredentialCache[$Credential.UserName] = $Credential
}
# load the required assemblies and Windows PowerShell modules
if(-not ((Load-SharePointOnlineClientComponentAssemblies) -and (Load-SharePointOnlineModule)) ) { return }
# Add the credentials to the client context and SharePoint Online service connection
# check for cached credentials to use
$o365TenantAdminCredential = Get-CredentialFromCredentialCache -CredentialName $tenantAdmin
if(-not $o365TenantAdminCredential)
{
# when credentials are not cached, prompt for the tenant admin credentials
$o365TenantAdminCredential = Get-Credential -UserName $tenantAdmin -Message "Enter the password for the Office 365 admin"
if(-not $o365TenantAdminCredential -or -not $o365TenantAdminCredential.UserName -or $o365TenantAdminCredential.Password.Length -eq 0 )
{
Write-Error -Message "Could not validate the supplied tenant admin credentials"
return
}
# add the credentials to the cache
Add-CredentialToCredentialCache -Credential $o365TenantAdminCredential
}
# connect to Office365 first, required for SharePoint Online cmdlets to run
Connect-SPOService -Url $sharepointAdminCenterUrl -Credential $o365TenantAdminCredential
# enumerate each of the specified site URLs
foreach($webUrl in $webUrls)
{
$grantedSiteCollectionAdmin = $false
try
{
# establish the client context and set the credentials to connect to the site
$script:clientContext = New-Object Microsoft.SharePoint.Client.ClientContext($webUrl)
$script:clientContext.Credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($o365TenantAdminCredential.UserName, $o365TenantAdminCredential.Password)
# initialize the site and web context
$script:clientContext.Load($script:clientContext.Site)
$script:clientContext.Load($script:clientContext.Web)
$script:clientContext.ExecuteQuery()
        # load the tenant admin user account and ensure it is present on the target SharePoint site
$tenantAdminUser = $script:clientContext.Web.EnsureUser($o365TenantAdminCredential.UserName)
$script:clientContext.Load($tenantAdminUser)
$script:clientContext.ExecuteQuery()
# check if the tenant admin is a site admin
if( -not $tenantAdminUser.IsSiteAdmin )
{
try
{
# grant the tenant admin temporary admin rights to the site collection
Set-SPOUser -Site $script:clientContext.Site.Url -LoginName $o365TenantAdminCredential.UserName -IsSiteCollectionAdmin $true | Out-Null
$grantedSiteCollectionAdmin = $true
}
catch
{
Write-Error $_.Exception
return
}
}
try
{
            # load the list or library using CSOM
$list = $null
$list = $script:clientContext.Web.Lists.GetByTitle($listTitle)
$script:clientContext.Load($list)
$script:clientContext.ExecuteQuery()
Remove-IrmConfiguration -List $list
}
catch
{
            Write-Error -Message "Error removing IRM configuration on site: $webUrl.`nError Details: $($_.Exception.ToString())"
}
}
finally
{
if($grantedSiteCollectionAdmin)
{
# remove the temporary admin rights to the site collection
Set-SPOUser -Site $script:clientContext.Site.Url -LoginName $o365TenantAdminCredential.UserName -IsSiteCollectionAdmin $false | Out-Null
}
}
}
Disconnect-SPOService -ErrorAction SilentlyContinue
```
| 48.923223 | 502 | 0.694366 | yue_Hant | 0.791522 |
0c83ba8f5c50b8bd59dda8bb70d306be6c7a55d1 | 1,661 | md | Markdown | help/home/c-get-started/c-analysis-vis/c-tables/c-add-chg-dim.md | dahlstro/data-workbench.en | 1bfea36c390d5e6aef76d5b2c9f12431571e68c8 | [
"Apache-2.0"
] | null | null | null | help/home/c-get-started/c-analysis-vis/c-tables/c-add-chg-dim.md | dahlstro/data-workbench.en | 1bfea36c390d5e6aef76d5b2c9f12431571e68c8 | [
"Apache-2.0"
] | null | null | null | help/home/c-get-started/c-analysis-vis/c-tables/c-add-chg-dim.md | dahlstro/data-workbench.en | 1bfea36c390d5e6aef76d5b2c9f12431571e68c8 | [
"Apache-2.0"
] | null | null | null | ---
description: You can add multiple dimensions to a table to attain a more detailed cross-tabulation of the data.
solution: Analytics
title: Add, change, and move a dimension
topic: Data workbench
uuid: d8e67374-3b2b-4548-9322-e83c52941331
---
# Add, change, and move a dimension{#add-change-and-move-a-dimension}
You can add multiple dimensions to a table to attain a more detailed cross-tabulation of the data.
You can add dimensions to either axis of the table.
**To add a new dimension**
* Right-click an element or the label of any dimension or the label of any metric and click **[!UICONTROL Add Dimension]** > *< **[!UICONTROL dimension name]**>.* The dimension is added to the visualization on the chosen axis.
The following example shows the Sessions metric graphed over the Day of Week dimension (in the top table) and the same table with a second dimension, This Month, added to the top axis (in the bottom table).

**To change a dimension**
* Right-click an element or the label of the dimension you want to change and click **[!UICONTROL Change Dimension]** > *< **[!UICONTROL dimension name]**>*.
**To move a dimension to another location**
You can move a dimension to a different position on the same axis or to the opposing axis.
* Right-click an element or the label of the dimension that you want to move, click **[!UICONTROL Move]** > *< **[!UICONTROL dimension name]**>*, then complete the appropriate step:
    * To move a dimension to the opposing axis, click **[!UICONTROL to top axis]** or **[!UICONTROL to left axis]**.
* To swap locations with another dimension on the same axis, click **[!UICONTROL (move here)]**.
| 46.138889 | 226 | 0.743528 | eng_Latn | 0.995245 |
0c850d0c5ff38d04643dba57b66f6057c18b0b37 | 830 | md | Markdown | docs/CONTRIBUTING.md | clintval/sample_sheet | a970bc8b44de496db3e2cad9eb7972c886d27b10 | [
"MIT"
] | 42 | 2018-01-19T18:43:53.000Z | 2022-02-16T21:43:09.000Z | docs/CONTRIBUTING.md | clintval/sample_sheet | a970bc8b44de496db3e2cad9eb7972c886d27b10 | [
"MIT"
] | 71 | 2018-01-19T19:10:18.000Z | 2021-11-23T08:38:01.000Z | docs/CONTRIBUTING.md | clintval/sample_sheet | a970bc8b44de496db3e2cad9eb7972c886d27b10 | [
"MIT"
] | 10 | 2018-05-10T18:56:12.000Z | 2021-11-23T07:33:14.000Z | # How to Contribute
Pull requests, feature requests, and issues welcome!
The complete test suite is configured through `Tox`:
```bash
❯ cd sample-sheet
❯ pip install tox
❯ tox # Run entire dynamic / static analysis test suite
```
List all environments with:
```
❯ tox -av
using tox.ini: .../sample-sheet/tox.ini
using tox-3.1.2 from ../tox/__init__.py
default environments:
py36 -> run the test suite with (basepython)
py37 -> run the test suite with (basepython)
lint -> check the code style
type -> type check the library
docs -> test building of HTML docs
additional environments:
dev -> the official sample_sheet development environment
```
To run just one environment:
```bash
❯ tox -e lint
```
To pass in positional arguments to a specified environment:
```bash
❯ tox -e py36 -- -x tests/test_sample_sheet.py
```
| 20.75 | 59 | 0.726506 | eng_Latn | 0.971351 |
0c85a53deee6e2b15628c64efd50aac540388e68 | 137 | md | Markdown | README.md | owenrumney/awsdiagrams | a2c6b7a6acd728e754d8fbbc56b2eb0933fe1bf1 | [
"Unlicense"
] | 5 | 2020-08-18T13:30:32.000Z | 2021-02-13T08:22:04.000Z | README.md | owenrumney/awsdiagrams | a2c6b7a6acd728e754d8fbbc56b2eb0933fe1bf1 | [
"Unlicense"
] | null | null | null | README.md | owenrumney/awsdiagrams | a2c6b7a6acd728e754d8fbbc56b2eb0933fe1bf1 | [
"Unlicense"
] | null | null | null | # awsdiagrams
## Diagram without login
Create the diagrams for AWS you need without having to log in.
https://awsdiagrams.io/editor/
| 15.222222 | 62 | 0.759124 | eng_Latn | 0.969067 |
0c85fb876c47172c237da4fcdf122a02b9b76867 | 917 | md | Markdown | src/members/genz-meets-genyx_2021-01-15.md | SocialEntrepreneurshipNetzwerk/send | 81cf4a8756cb9d1b6674c1d54f8c137ce8e60443 | [
"MIT"
] | null | null | null | src/members/genz-meets-genyx_2021-01-15.md | SocialEntrepreneurshipNetzwerk/send | 81cf4a8756cb9d1b6674c1d54f8c137ce8e60443 | [
"MIT"
] | 3 | 2018-06-19T07:08:25.000Z | 2021-09-01T23:58:49.000Z | src/members/genz-meets-genyx_2021-01-15.md | SocialEntrepreneurshipNetzwerk/send | 81cf4a8756cb9d1b6674c1d54f8c137ce8e60443 | [
"MIT"
] | null | null | null | ---
title: GenZ meets GenYX
description: >
  The future is young and old
  Everyone talks about gender diversity, yet age diversity is just as integral a part of diversity. Demographic change will confront us with major social and economic challenges. Growing technology demands often leave older generations behind. Digitalization, however, should foster communication between generations rather than hinder it. Decisions about the future can only be made together, because everyone, young or old, is a significant part of the system and of that future.
impactArea:
- Bildung
- Demographischer Wandel
- Zukunft der Arbeit
organization: true
image: /uploads/genz-meets-genyz.png
email: [email protected]
link: https://www.genzmeetsgenyx.com/
city: München
postalCode: 80798
federalState: Bayern
---
| 48.263158 | 589 | 0.821156 | deu_Latn | 0.99042 |
0c86bd00c6dbdacf6a7a72e69fc4a81449fcd723 | 2,950 | md | Markdown | docs/vs-2015/debugger/walkthrough-debugging-at-design-time.md | viniciustavanoferreira/visualstudio-docs.pt-br | 2ec4855214a26a53888d4770ff5d6dde15dbb8a5 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/vs-2015/debugger/walkthrough-debugging-at-design-time.md | viniciustavanoferreira/visualstudio-docs.pt-br | 2ec4855214a26a53888d4770ff5d6dde15dbb8a5 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/vs-2015/debugger/walkthrough-debugging-at-design-time.md | viniciustavanoferreira/visualstudio-docs.pt-br | 2ec4855214a26a53888d4770ff5d6dde15dbb8a5 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: 'Walkthrough: Debugging at Design Time | Microsoft Docs'
ms.date: 11/15/2016
ms.prod: visual-studio-dev14
ms.technology: vs-ide-debug
ms.topic: conceptual
dev_langs:
- FSharp
- VB
- CSharp
- C++
- JScript
- VB
- CSharp
- C++
helpviewer_keywords:
- debugging [Visual Studio], design-time
- breakpoints, design-time debugging
- Immediate window, design-time debugging
- design-time debugging
ms.assetid: 35bfdd2c-6f60-4be1-ba9d-55fce70ee4d8
caps.latest.revision: 23
author: MikeJo5000
ms.author: mikejo
manager: jillfra
ms.openlocfilehash: 54466cc3561c194199bbad2b35cd00433da2b0f3
ms.sourcegitcommit: 94b3a052fb1229c7e7f8804b09c1d403385c7630
ms.translationtype: MT
ms.contentlocale: pt-BR
ms.lasthandoff: 04/23/2019
ms.locfileid: "68149424"
---
# <a name="walkthrough-debugging-at-design-time"></a>Walkthrough: Debugging at Design Time
[!INCLUDE[vs2017banner](../includes/vs2017banner.md)]
You can use the Visual Studio **Immediate** window to execute a function or subroutine while your application is not running. If the function or subroutine contains a breakpoint, Visual Studio breaks execution at the appropriate point. You can then use the debugger windows to examine the state of your program. This feature is called design-time debugging.
The following procedure shows how to use this feature.
### <a name="to-hit-breakpoints-from-the-immediate-window"></a>To hit breakpoints from the Immediate window
1. Paste the following code into a Visual Basic console application:
```
Module Module1
Sub Main()
MySub()
End Sub
Function MyFunction() As Decimal
Static i As Integer
i = i + 1
Dim s As String
s = "Add Breakpoint here"
Return 4
End Function
Sub MySub()
MyFunction()
End Sub
End Module
```
2. Set a breakpoint on the line that reads `s = "Add Breakpoint here"`.
3. Type the following in the **Immediate** window: `?MyFunction<enter>`
4. Verify that the breakpoint is hit and that the call stack is correct.
5. On the **Debug** menu, click **Continue**, and verify that you are in design mode.
6. Type the following in the **Immediate** window: `?MyFunction<enter>`
7. Type the following in the **Immediate** window: `?MySub<enter>`
8. Verify that the breakpoint is hit, and examine the value of the static variable `i` in the **Locals** window. It should have the value 3.
9. Verify that the call stack is correct.
10. On the **Debug** menu, click **Continue**, and verify that you are in design mode.
## <a name="see-also"></a>See also
[Debugger Security](../debugger/debugger-security.md)
[Debugger Basics](../debugger/debugger-basics.md)
| 33.908046 | 401 | 0.694915 | por_Latn | 0.987049 |
0c86ca75064019c69e03aada644b815385b0d78e | 4,538 | md | Markdown | desktop-src/Msi/using-windows-installer-with-uac.md | citelao/win32 | bf61803ccb0071d99eee158c7416b9270a83b3e4 | [
"CC-BY-4.0",
"MIT"
] | 2 | 2022-03-18T02:46:08.000Z | 2022-03-18T03:19:15.000Z | desktop-src/Msi/using-windows-installer-with-uac.md | citelao/win32 | bf61803ccb0071d99eee158c7416b9270a83b3e4 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | desktop-src/Msi/using-windows-installer-with-uac.md | citelao/win32 | bf61803ccb0071d99eee158c7416b9270a83b3e4 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
Description: Windows Installer complies with User Account Control (UAC) in Windows Vista.
ms.assetid: 13955ded-6b7f-475f-bb0f-6530a0b4963f
title: Using Windows Installer with UAC
ms.topic: article
ms.date: 05/31/2018
---
# Using Windows Installer with UAC
Windows Installer complies with [*User Account Control*](u-gly.md) (UAC) in Windows Vista. With authorization from an administrator, the Windows Installer can install applications or patches on behalf of a user that may not be a member of the Administrators group. This is referred to as an [*elevated*](e-gly.md) installation because the Windows Installer makes changes to the system on behalf of the user that would not normally be allowed if the user were making the changes directly.
- When using Windows Vista in a corporate environment, applications can be designated as managed applications. Using application deployment and [Group Policy](/previous-versions/windows/desktop/Policy/group-policy-start-page), administrators can lockdown directories and then assign or publish the managed applications in those directories to [*standard users*](s-gly.md) for install, repair, or removal. Managed applications are registered in the **HKEY\_LOCAL\_MACHINE** registry hive. Once an application has been registered as a managed application, subsequent installation operations always run with elevated privileges. If the user is running as an administrator, no prompting is required to continue the installation. If the user is running as a standard user, and the application has already been assigned or published, the installation of the managed application can continue without prompting.
- When using Windows Vista in a non-corporate environment, UAC handles the elevation of application installation. Windows Installer 4.0 can call to the [*Application Information Service*](a-gly.md) (AIS) to request administrator authorization to elevate an installation. Before an installation identified as requiring administrator privileges can be run, UAC prompts the user for consent to elevate the installation. The consent prompt is displayed by default, even if the user is a member of the local Administrators group, because administrators run as standard users until an application or system component that requires administrative credential requests permission to run. This user experience is called [*Admin Approval Mode*](a-gly.md) (AAM). If a standard user attempts to install the application, the user has to get a person with administrator privilege to provide them their administrator credentials to continue the installation. This user experience is called an [*Over the Shoulder*](o-gly.md) (OTS) credential prompt.
- Because UAC restricts privileges during the stages of an installation, developers of Windows Installer packages should not assume that their installation will always have access to all parts of the system. Windows Installer package developers should therefore adhere to the package guidelines described in [Guidelines for Packages](guidelines-for-packages.md) to ensure their package works with UAC and Windows Vista. A package that has been authored and tested to comply with UAC should contain the [**MSIDEPLOYMENTCOMPLIANT**](msideploymentcompliant.md) property set to 1.
- An administrator can also use the methods described in the section: [Installing a Package with Elevated Privileges for a Non-Admin](installing-a-package-with-elevated-privileges-for-a-non-admin.md) to enable a non-administrator user to install an application with elevated system privileges.
- Privileges are required to install an application in the per-user-managed context, and therefore subsequent Windows Installer reinstallations or repairs of the application are also performed by the installer using elevated privileges. This means that only patches from trusted sources can be applied to an application in the per-user-managed state. Beginning with Windows Installer 3.0, you can apply a patch to a per-user managed application after the patch has been registered as having elevated privileges. For information see [Patching Per-User Managed Applications](patching-per-user-managed-applications.md).
> [!Note]
> When elevated privileges are not required to install a Windows Installer package, the author of the package can suppress the dialog box that UAC displays to prompt users for administrator authorization. For more information, see [Authoring Packages without the UAC Dialog Box](authoring-packages-without-the-uac-dialog-box.md).
| 174.538462 | 1,035 | 0.813354 | eng_Latn | 0.99805 |
0c86fa6f1ceb2c9f35f4a82cc198723c353c9fb3 | 230 | md | Markdown | README.md | nickatrons/SWapp | 729c0cde5ded8d71c5cd1b5949950b859a19324c | [
"MIT"
] | null | null | null | README.md | nickatrons/SWapp | 729c0cde5ded8d71c5cd1b5949950b859a19324c | [
"MIT"
] | null | null | null | README.md | nickatrons/SWapp | 729c0cde5ded8d71c5cd1b5949950b859a19324c | [
"MIT"
] | null | null | null | # SWapp
From the project root, cd into the app folder, where the package.json file resides.
Install the project:
npm install
Compile for development (ignore the errors):
npm run serve
Then observe it running on http://localhost:8080/
| 16.428571 | 69 | 0.752174 | eng_Latn | 0.977149 |
0c874183e6ed52a6ccaa9aac1be6691462c0aead | 705 | md | Markdown | api/Word.FileConverters.Parent.md | kibitzerCZ/VBA-Docs | 046664c5f09c17707e8ee92fd1505ddd0f6c9a91 | [
"CC-BY-4.0",
"MIT"
] | 2 | 2020-03-09T13:24:12.000Z | 2020-03-09T16:19:11.000Z | api/Word.FileConverters.Parent.md | kibitzerCZ/VBA-Docs | 046664c5f09c17707e8ee92fd1505ddd0f6c9a91 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | api/Word.FileConverters.Parent.md | kibitzerCZ/VBA-Docs | 046664c5f09c17707e8ee92fd1505ddd0f6c9a91 | [
"CC-BY-4.0",
"MIT"
] | 1 | 2019-11-28T06:51:45.000Z | 2019-11-28T06:51:45.000Z | ---
title: FileConverters.Parent property (Word)
keywords: vbawd10.chm161088490
f1_keywords:
- vbawd10.chm161088490
ms.prod: word
api_name:
- Word.FileConverters.Parent
ms.assetid: 43a746be-1733-a12d-f365-bc4b5c3db373
ms.date: 06/08/2017
localization_priority: Normal
---
# FileConverters.Parent property (Word)
Returns an **Object** that represents the parent object of the specified **FileConverters** object.
## Syntax
_expression_.**Parent**
_expression_ Required. A variable that represents a '[FileConverters](Word.fileconverters.md)' collection.
## See also
[FileConverters Collection Object](Word.fileconverters.md)
[!include[Support and feedback](~/includes/feedback-boilerplate.md)] | 22.03125 | 106 | 0.78156 | eng_Latn | 0.344892 |
0c87460b82a14c2279ca18ebebdcb1bf3c4e9b3e | 1,893 | md | Markdown | README.md | chendo/airchat | d0da3bec0d94b88bf8a990bbebbd7703ced5fc12 | [
"MIT"
] | 169 | 2016-12-11T06:02:11.000Z | 2021-11-28T21:39:36.000Z | README.md | chendo/airchat | d0da3bec0d94b88bf8a990bbebbd7703ced5fc12 | [
"MIT"
] | 1 | 2016-12-11T05:26:53.000Z | 2016-12-11T05:27:00.000Z | README.md | chendo/airchat | d0da3bec0d94b88bf8a990bbebbd7703ced5fc12 | [
"MIT"
] | 8 | 2016-12-12T16:16:16.000Z | 2019-11-23T17:31:44.000Z | # AirChat
AirChat is a zero-dependency* P2P CLI chat tool that (ab)uses the AirDrop interface to
allow chatting across WiFi networks (or no WiFi network).
A RailsCamp AU 20 project.

## Features
* Chat to other AirChat users in proximity without being on the same network
* Self-contained - no gems, nothing else to download/install
* Automatically keeps AirDrop active
* Commands: `/nick`, `/who`, `/me`, `/quit`
* User colours tied to their IPv6 address
## Requirements
* OS X with system Ruby 2+ with working AirDrop
* Ruby 2.0 or higher (comes with OS X)
* tcpdump (comes with OS X)
## Usage
```
# Get it
curl -L https://github.com/chendo/airchat/raw/master/airchat.rb > airchat.rb && chmod +x airchat.rb
# or get someone to AirDrop it to you, etc.
# AirChat requires raw access to the /dev/bpf* interface.
# Run using sudo
sudo ./airchat.rb
# OR
# Give permission to /dev/bpf*
sudo chgrp staff /dev/bpf* && sudo chmod g+rw /dev/bpf* # These permissions will reset on reboot
./airchat.rb
```
## How does it work?
AirChat uses the `awdl0` interface to talk to other machines with AirDrop active.
However, OS X restricts binding to this interface, and non-AirDrop network traffic is rejected
with `ICMP Port Unreachable`. AirChat gets around this by using `tcpdump` to receive UDP data,
as OS X doesn't stop you from sending packets through that interface.
AirChat broadcasts JSON-encoded messages in UDP to `ff02::fb` on port `1337`.
## Caveats/TODO
* Messages are transmitted in plain text.
* No direct messaging
* One channel only (you can specify a different port by modifying the source)
* Message delivery is not guaranteed
* No message repeater functionality
## License
MIT.
ANSI RGB magic sauce from the [paint gem](https://github.com/janlelis/paint).
| 30.532258 | 112 | 0.748019 | eng_Latn | 0.973042 |
0c87ccbd83da3ca4ab21a6a63edacb701d0d8e9b | 1,816 | md | Markdown | 06-PageAnatomy/Search.md | pr1mer-tech/WebGuidelines | 470f2b9e3f4a0be5c3b1044fe86512406a84d141 | [
"MIT"
] | 2 | 2021-04-15T23:41:18.000Z | 2021-04-16T01:59:26.000Z | 06-PageAnatomy/Search.md | pr1mer-tech/WebGuidelines | 470f2b9e3f4a0be5c3b1044fe86512406a84d141 | [
"MIT"
] | 16 | 2020-10-11T15:01:16.000Z | 2022-03-28T03:20:25.000Z | 06-PageAnatomy/Search.md | pr1mer-tech/WebGuidelines | 470f2b9e3f4a0be5c3b1044fe86512406a84d141 | [
"MIT"
] | null | null | null | # Search
When your website has a lot of content, it can be a good idea to have an internal search engine. A search bar allows people to search through a large collection of values by typing text into a field. A search bar can be displayed alone, or in a navigation bar. It’s always a good idea to display the search bar inside the navigation bar as it's always accessible.
**Differentiate the search bar from a simple input.** Differentiate between the two in your components. Try for example to add a magnifying glass next to the text field to make it discernible.
**Enable the Clear button.** Most search bars include a Clear button that erases the contents of the field.
**If necessary, provide hints and context in a search bar.** A search bar's field can contain placeholder text—like “Search Clothing, Shoes and Accessories” or simply “Search”—as a reminder of the context being searched. A succinct, one-line prompt with appropriate punctuation can also appear directly above a search bar to provide guidance.
**Make your search system fast and accurate.** No one wants to spend hours searching for information. If someone makes a typo or searches for a synonym, make sure that your system can still find the right result.
> We’re not asking you to become Google, but please have a decent search engine. Algolia or other front-end libraries such as Fuse or OrionSearch’s SearchKit are now handling theses things very well, so consider using them.
**Where possible, add the most obvious results when the user types.** And for more advanced search results, let the user access the full results by hiting `Enter`. For example, if a user is looking for the city of London and starts typing "lon", display "London" directly below it.
 | 129.714286 | 363 | 0.78304 | eng_Latn | 0.999524 |
0c87e7cd9b8b5ac64dddee6308bdbc5a7333eecd | 1,459 | md | Markdown | docs/cppcx/namespaces-reference-c-cx.md | urmyfaith/cpp-docs | 4a4edb34d47cb5811ddc07398bfdeb0dd4382228 | [
"CC-BY-4.0",
"MIT"
] | 1 | 2021-01-29T07:51:50.000Z | 2021-01-29T07:51:50.000Z | docs/cppcx/namespaces-reference-c-cx.md | urmyfaith/cpp-docs | 4a4edb34d47cb5811ddc07398bfdeb0dd4382228 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/cppcx/namespaces-reference-c-cx.md | urmyfaith/cpp-docs | 4a4edb34d47cb5811ddc07398bfdeb0dd4382228 | [
"CC-BY-4.0",
"MIT"
] | 1 | 2022-03-03T17:33:52.000Z | 2022-03-03T17:33:52.000Z | ---
description: "Learn more about: Namespaces Reference (C++/CX)"
title: "Namespaces Reference (C++/CX)"
ms.date: "01/22/2017"
helpviewer_keywords: ["C++/CX namespaces"]
ms.assetid: 5ebc0b49-1f22-48a7-90c4-a310bab9aba6
---
# Namespaces Reference (C++/CX)
The articles in this section of the documentation describe namespaces that support the compiler for C++/CX.
## Compiler-supplied namespaces
To simplify the coding of programs that target the Windows Runtime, the C++/CX compiler and its supporting header files provide namespaces that define a wide range of types. The namespaces define the built-in numeric types; strings, arrays, and collections; C++ exceptions that represent Windows Runtime errors; and language-specific enhancements to standard Windows Runtime types.
## Related topics
|Title|Description|
|-----------|-----------------|
|[default namespace](../cppcx/default-namespace.md)|Contains descriptions of built-in, fundamental types.|
|[Platform namespace](../cppcx/platform-namespace-c-cx.md)|Contains descriptions of types that you can use, and also internal types that are used only by the compiler infrastructure.|
|[Windows::Foundation::Collections Namespace](../cppcx/windows-foundation-collections-namespace-c-cx.md)|Contains descriptions of enhancements and extensions to the Windows Runtime`Windows::Foundation::Collections` namespace.|
## See also
[C++/CX Language Reference](../cppcx/visual-c-language-reference-c-cx.md)
| 54.037037 | 381 | 0.765593 | eng_Latn | 0.960305 |
0c87ef23c6766d2ac329a4871f70e343e23a3e6a | 664 | md | Markdown | doc/INSTALL.md | Seanjimon/stcgal | 83c0b47f621077a1457e08b8cd765a5c38328b4b | [
"MIT"
] | null | null | null | doc/INSTALL.md | Seanjimon/stcgal | 83c0b47f621077a1457e08b8cd765a5c38328b4b | [
"MIT"
] | null | null | null | doc/INSTALL.md | Seanjimon/stcgal | 83c0b47f621077a1457e08b8cd765a5c38328b4b | [
"MIT"
] | null | null | null | Installation
============
stcgal requires Python 3.2 (or later) and pySerial. USB support is
optional and requires pyusb 1.0.0b2 or later. You can run stcgal
directly with the included ```stcgal.py``` script if the dependencies
are already installed.
There are several options for permanent installation:
* Use Python3 and ```pip```. Run
```pip3 install git+https://github.com/grigorig/stcgal.git```
to install the latest version of stcgal globally on your system.
This may require administrator/root permissions for write access
to system directories.
* Use setuptools. Run ```./setup.py build``` to build and
```sudo ./setup.py install``` to install stcgal.
| 34.947368 | 69 | 0.748494 | eng_Latn | 0.994717 |
0c88c02cc88f256958be0313ab633bd129854ab1 | 610 | md | Markdown | UPDATE.md | zhangyingwei/cockroach | 5298ca2aaadff3740c0730c72da5d95db63046eb | [
"Apache-2.0"
] | 133 | 2017-09-14T03:21:36.000Z | 2022-01-02T15:26:05.000Z | UPDATE.md | Shakegun/cockroach | 5298ca2aaadff3740c0730c72da5d95db63046eb | [
"Apache-2.0"
] | 8 | 2018-02-10T13:03:51.000Z | 2018-09-13T13:08:05.000Z | UPDATE.md | Shakegun/cockroach | 5298ca2aaadff3740c0730c72da5d95db63046eb | [
"Apache-2.0"
] | 20 | 2017-12-19T19:14:00.000Z | 2021-11-19T08:08:14.000Z | # cockroach 爬虫 更新日志
*修改了task 中 get url 编译 url 的 bug
# 2018-02-28
* 增加了 queue-redis 模块
* 新年快乐
* 需改为模块
* 修改了 httpclient 实现
# 2018-02-10
* 增加了失败任务队列
* 增加了失败任务重试功能
* 调整了 task 的包位置
* task 增加了 deep 参数
* 换掉 ArrayBlockingQueue 使用可以定义优先级的 PriorityBlockingQueue,并结合 task 的 deep 参数实现任务的优先级
* 增加了response filter
* 增加了execiterslistener 监听任务开始与完成。初衷是为了有一个方法能表示出程序是否已经执行完毕,主要作用类似于:将所有爬取内容最后打包统一发邮件。
## 2018-01-19
* 修改了 response 的 close 方法到 finally 方法块中
* 增加了队列过滤器
## 2017-12-24
* 增加了 TaskResponse 中指定编码格式的接口
* 去掉了 taskqueue 中的单例
* 删除了一些无用类
* 增加了 cookie 生成器与 header 生成器
## 2017-09-13
* 增加了 log4j 为默认日志组件
* 修改了一些架构上的问题
| 17.428571 | 83 | 0.759016 | yue_Hant | 0.54034 |
0c89ee5df49c01181fab7f8d0b6ff247d6675139 | 10,402 | md | Markdown | src/pages/docs/getting-started/importing-and-exporting-data.md | seanryankeegan/postman-docs | cb5d001113296c3cd731b8893b8219d605e78dca | [
"Apache-2.0"
] | null | null | null | src/pages/docs/getting-started/importing-and-exporting-data.md | seanryankeegan/postman-docs | cb5d001113296c3cd731b8893b8219d605e78dca | [
"Apache-2.0"
] | null | null | null | src/pages/docs/getting-started/importing-and-exporting-data.md | seanryankeegan/postman-docs | cb5d001113296c3cd731b8893b8219d605e78dca | [
"Apache-2.0"
] | 1 | 2022-03-17T09:13:37.000Z | 2022-03-17T09:13:37.000Z | ---
title: 'Importing and exporting data'
order: 8.2
page_id: 'importing_and_exporting_data'
contextual_links:
- type: section
name: "Additional Resources"
- type: subtitle
name: "Case Studies"
- type: link
name: "Healthwise"
url: "https://www.postman.com/customers/healthwise.pdf"
- type: subtitle
name: "Related Blog Posts"
- type: link
name: "Sync your specs"
url: "https://blog.postman.com/sync-your-specs/"
- type: link
name: "Importing SoapUI projects into Postman"
url: "https://blog.postman.com/importing-soapui-projects-into-postman/"
- type: link
name: "Importing RAML folders into Postman"
url: "https://blog.postman.com/supporting-raml-folders-in-postman/"
- type: link
name: "Introducing Postman Collection Format Schema"
url: "https://blog.postman.com/introducing-postman-collection-format-schema/"
- type: link
name: "Travelogue of Postman Collections Format v2"
url: "https://blog.postman.com/travelogue-of-postman-collection-format-v2/"
- type: subtitle
name: "Next Steps"
- type: link
name: "Using the API Builder"
url: "/docs/designing-and-developing-your-api/the-api-workflow/"
warning: false
---
Postman can import and export Postman data, including collections, environments, data dumps, and globals. Postman can also import non-Postman data in the form of API schemas to help you consolidate your API development workflow.
## Contents
* [Importing data into Postman](#importing-data-into-postman)
* [Importing Postman data](#importing-postman-data)
* [Converting Postman collections from v1 to v2](#converting-postman-collections-from-v1-to-v2)
* [Importing API specifications](#importing-api-specifications)
* [Importing via GitHub repositories](#importing-via-github-repositories)
* [Exporting Postman data](#exporting-postman-data)
* [Exporting collections](#exporting-collections)
* [Exporting environments](#exporting-environments)
* [Exporting data dumps](#exporting-data-dumps)
* [Next steps](#next-steps)
## Importing data into Postman
You can import collections or your API specifications directly into Postman.
To import your data into Postman, click **Import** in the upper-left corner.

You can import your data from files, folders, links, raw text, or GitHub repositories.
### Importing Postman data
You can import Postman data you previously exported, including collections, environments, data dumps, and globals.
To import Postman data, click **Import**. Select your file or folder, input your link, paste your raw text, or [import from GitHub](#importing-github-repositories). Postman will automatically recognize Postman data, confirming the name, format, and what the file will import as. Click **Import** to bring your data into Postman.

#### Converting Postman collections from v1 to v2
Postman no longer supports the collection v1 format and will return an error if you attempt to import a collection in this format. You can convert your collection's format from v1 to v2 to import it into Postman.

You can take the following steps to convert the Postman collection format from v1 to v2.
In the terminal of your choice, enter the following command to install the Postman collection transformer.
```bash
sudo npm install -g postman-collection-transformer
```
You can retrieve a list of convert options by running the command with the ``-h`` flag.
```bash
postman-collection-transformer convert -h
```
| Option | Details |
|:--|:--|
| `-h`, `--help` | Outputs usage information |
| `-i`, `--input <path>` | Returns a path to the input postman collection file |
| `-j`, `--input-version [version]` | Returns the version of the input collection format standard (v1 or v2) |
| `-o`, `--output <path>` | Returns a path to the output postman collection file |
| `-p`, `--output-version [version]` | Returns the version of the output collection format standard (v1 or v2) |
| `-P`, `--pretty` | Prints the output in pretty format |
| `--retain-ids` | Retains the request and folder IDs during conversion (collection ID is always retained) |
| `-w`, `--overwrite` | Overwrites the output file if it exists |
You can convert an individual Postman collection from v1 to v2 by entering the command below.
```bash
postman-collection-transformer convert -i <path to input Postman collection file> -o <path where the output Postman file will be downloaded> -j 1.0.0 -p 2.0.0 -P
```
The resulting collection will be in v2 format and downloaded to your target file path. See the [Postman Collection Transformer](https://github.com/postmanlabs/postman-collection-transformer) for more information on the collection conversion.
### Importing API specifications
Postman directly supports importing the following formats:
* [OpenAPI 3.0](https://github.com/postmanlabs/openapi-to-postman)
* Swagger [1.2](https://github.com/postmanlabs/swagger1-to-postman) and [2.0](https://github.com/postmanlabs/swagger2-postman2-lambda)
* RAML [0.8](https://github.com/postmanlabs/raml-to-postman) and [1.0](https://github.com/postmanlabs/raml1-to-postman)
* [GraphQL](https://github.com/postmanlabs/graphql-to-postman)
* WADL
* [cURL](https://github.com/postmanlabs/curl-to-postman)
There are also tools on GitHub to convert the following into a Postman collection for import:
* [Runscope](https://github.com/postmanlabs/runscope-to-postman)
* [DHC](https://github.com/postmanlabs/dhc-to-postman)
To import your API specifications into Postman, click **Import**. Select your file or folder, input your link, or paste your raw text. Confirm the name, format, and what you would like your data to import as, then click **Import** to bring your data into Postman.

> You can configure your **Import Settings**, which will differ depending on your API specification.
You can import several API specification files at once. Select the workspace you'd like to import the APIs into, choose whether you want to generate collections from the APIs, configure the details, and click **Import**.
When importing into a team workspace, you can also choose to add the APIs to the [Private API Network](/docs/collaborating-in-postman/adding-private-network/).
[](https://assets.postman.com/postman-docs/import-multiple-apis.gif)
### Importing via GitHub repositories
> You must be signed in to a [Postman account](/docs/getting-started/postman-account/#signing-up-for-a-postman-account) to use this feature.
You can import data in bulk from a GitHub repository by selecting **Import** > **Code repository** > **Connect to GitHub**.
<img alt="Import from github" src="https://assets.postman.com/postman-docs/import-from-github1.jpg"/>
Confirm your GitHub account and **Authorize postmanlabs** to access your repositories.
<img alt="Import from github" src="https://assets.postman.com/postman-docs/authorize-postman-github2.jpg" width="350px"/>
In Postman, select your GitHub organization, repository, and branch, then **Continue**.
<img alt="Select org, repo, branch" src="https://assets.postman.com/postman-docs/select-repo.jpg"/>
Confirm the files you would like to import into Postman. You can also opt to **Generate collection from imported APIs** and select what you would like to link this collection as. Click **Show advanced settings** to control how Postman should generate collections based on your file types, then select **Import**.
<img alt="Confirm github import" src="https://assets.postman.com/postman-docs/confirm-import.jpg"/>
You will receive a confirmation once the import has completed.
<img alt="Import completed" src="https://assets.postman.com/postman-docs/successful-import.jpg"/>
You can now view your newly imported files and generated collections in Postman.
<img alt="Imported data in app" src="https://assets.postman.com/postman-docs/imported-data-in-app.jpg"/>
## Exporting Postman data
You can export your Postman data, including collections, environments, data dumps, and globals, as JSON files. These files can be imported back into any Postman instance, or utilized by [Newman](/docs/running-collections/using-newman-cli/command-line-integration-with-newman/), Postman's command-line collection runner.
### Exporting collections
You can export your collections from Postman by selecting the **...** next to the collection, then **Export**.

You can then select the format you'd like your collection to export as. Click **Export** to download your newly generated JSON file.
> The export to Collection v1 format is no longer supported in Postman.

> Learn more about Postman's [collection formats](https://blog.postman.com/travelogue-of-postman-collection-format-v2/).
### Exporting environments
You can export your environments from Postman by selecting the gear icon in the upper-right corner to open **Manage Environments**. Click the download symbol next to your environment to download your newly generated JSON file.

### Exporting data dumps
You can export a data dump of all of your collections, environments, globals, and header presets in Postman. Select the gear icon in the upper-right corner to open **Settings**. Click to open the **Data** tab, then **Download** to save the newly generated JSON file of your Postman data.

## Next steps
You can collaborate on collections by [sharing](/docs/collaborating-in-postman/sharing/) and [commenting](/docs/collaborating-in-postman/commenting-on-collections/) to discuss your API projects with team members. Learn more about [Postman's API workflow](/docs/designing-and-developing-your-api/the-api-workflow/).
<!-- File: docs/includes/ssnotedepnextdontuse-md.md (repo: marcustung/sql-docs.zh-tw) -->
---
ms.openlocfilehash: d276d4df9b5b57dabe74bb91917a587fea1a6cb4
ms.sourcegitcommit: b2464064c0566590e486a3aafae6d67ce2645cef
ms.translationtype: HT
ms.contentlocale: zh-TW
ms.lasthandoff: 07/15/2019
ms.locfileid: "68160777"
---
This feature will be removed in a future version of Microsoft SQL Server. Avoid using this feature in new development work, and modify applications that currently use this feature as soon as possible.
<!-- File: README.md (repo: pablang/sinatra_generator) -->
# SinatraGenerator
Generate a simple hello-world Sinatra app.
## Installation
Install it yourself as:
$ gem install sinatra_generator
## Usage:
sinatra new [APP_NAME]
## Options:
-m, [--modular=MODULAR] # modular style. Inherits from Sinatra::Base
-v, [--views=VIEWS] # include views folder, index.erb and layout.erb
-a, [--assets=ASSETS] # include public, javascripts and stylesheets folder
-p, [--procfile=PROCFILE] # include Procfile
## example
sinatra new blog -mvpa
.
├── config.ru
├── Gemfile
├── main.rb
├── Procfile
├── public
│ ├── javascripts
│ │ └── application.js
│ └── stylesheets
│ └── main.css
└── views
├── index.erb
└── layout.erb
| 19.625 | 83 | 0.574522 | eng_Latn | 0.64197 |
<!-- File: node_modules/node-http-server/README.md (repo: goodguytt01/gateway-front) -->
Node http server
================
----
Simple to use stand alone node HTTP Server you can spin up from node apps, bash scripts, the commandline, C or python apps etc.
npm install node-http-server
[](https://npmjs.org/package/node-http-server)
----
## Defaults
---
#### currently modifiable via any interface, commandline, bash, node etc.
port : 8080
root : Current Working Directory (where you execute the command from)
domain : 0.0.0.0
index : index.html
verbose : false
noCache : true
log : false
logFunction : serverLogging
`` port `` the port on which the server should run
`` root `` the absolute location to the root dir for the public file system
`` domain `` the domain which this server applies to. You can add more servers via the node `` domains `` implementation described below than you can via bash or commandline. If you want to accept incoming requests for ***ANY Applicable Domain*** use `` 0.0.0.0 `` this will allow any request that is pointed at this machine on the specified port to use this server config.
`` index `` the default file to look for in a dir. if not found a **404** will be displayed
`` verbose `` should the server display detailed info about what it is doing
`` noCache `` should the server prevent caching
`` log `` full path to log file, if specified file is not present it will be created, however the dir must be there. ie. /tmp/server.log
`` logFunction `` this defaults to log JSON data in the `` log `` file. However, you can overwrite this and do whatever you like with the JSON data if you so choose.
---
#### currently modifiable via node
domains : {}
contentType : {
html : 'text/html',
css : 'text/css',
js : 'text/javascript',
json : 'application/json',
txt : 'text/plain',
jpeg : 'image/jpeg',
jpg : 'image/jpeg',
png : 'image/png',
gif : 'image/gif',
ico : 'image/x-icon',
appcache: 'text/cache-manifest'
}
restrictedType: {}
errors : {
headers : {
'Content-Type' : 'text/plain'
},
404: '404 MIA',
415: '415 File type not supported',
403: '403 Access Denied',
500: '500 {{err}}'
}
`` domains `` this is a mapping of hostname to path. It can be used for multiple different domains, or for subdomains.
`` contentType `` mapping of file extension to header content type.
`` restrictedType `` extensions to which external access will be denied.
`` errors `` error headers and error strings, these can be anything you like from html to text etc. just make sure they all can use the same headers. The **500** error will replace `` {{err}} `` in the specified value with the actual error message from the server.
---
## Commandline / bash use
`` launch `` is an argument that specifies to launch the server now with the provided arguments and defaults
node ~/git/node-http-server/server/http.js root=~/myApp/ port=9999 launch=now
you can specify any of the variables from the ***currently modifiable via any interface, commandline, bash, node etc.*** section above. The order does not matter.
node ~/git/node-http-server/server/http.js root=~/myApp/ port=8888 verbose=true launch=now
---
## node app use
var server=require('node-http-server');
`` server `` has 2 methods, `` deploy `` and `` configTemplate ``
`` server.configTemplate `` will generate a complete config file based off of the default values and arguments passed in when launching the app. **DO NOT USE launch=now** as an argument for a node app. This will result in launching 2 servers, the one you specify with the arguments passed and then the one the node app launches too.
`` server.deploy `` will accept any config params and merge them with a fresh configTemplate, so passing a modified config based off of `` server.configTemplate() `` will result in using only the values from the modified config passed when deploying as it will override all of the defaults. ***The passed config object only merges to one level deep*** so if you pass a multi level object like `` contentTypes `` it will overwrite the default config with what you sent for that object rather than merging your object with the default.
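To make the one-level-deep merge concrete, here is a small sketch using plain `Object.assign` (which is shallow in the same way); it mimics the documented behavior and is not the server's own merge code:

```javascript
// Shallow (one-level-deep) merge sketch: nested objects in the passed config
// replace the defaults wholesale instead of being merged key by key.
const defaults = {
    port: 8080,
    contentType: { html: 'text/html', css: 'text/css' }
};
const passed = {
    contentType: { mp4: 'video/mp4' } // multi-level object passed by the caller
};
const merged = Object.assign({}, defaults, passed);

console.log(merged.port);             // 8080 (top-level default kept)
console.log(merged.contentType.mp4);  // 'video/mp4'
console.log(merged.contentType.html); // undefined (contentType was replaced, not merged)
```

If you need the defaults preserved inside nested objects like `` contentType ``, start from `` server.configTemplate() `` and mutate the nested object, as the advanced example further down does.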
---
#### node examples
can be found in the examples folder
#### basic
this app could be launched as
`` node basicApp.js verbose=true ``
to force verbose logging. This can be helpful if you have many servers in a single app and want them all to be verbose right now for debugging or testing purposes.
var server=require('node-http-server');
console.log(server);
server.deploy(
{
port:8000,
root:'~/myApp/'
}
);
---
#### verbose
var server=require('node-http-server');
console.log(server);
server.deploy(
{
verbose:true,
port:8001,
root:'~/myApp/'
}
);
---
#### advanced
var server=require('node-http-server');
console.log(server);
var config=server.configTemplate();
config.errors['404'] = 'These are not the files you are looking for...';
config.contentType.mp4 = 'video/mp4';
config.port = 8005;
config.verbose = true;
config.root = '~/myApp/'
server.deploy(config);
---
#### multiple domains or subdomains
var server=require('node-http-server');
console.log(server);
server.deploy(
{
verbose:true,
port:8010,
root:process.env.HOME+'/myApp/',
domain:'myapp',
domains:{
'a.myapp':process.env.HOME+'/myApp/mySubdomain/',
'yourapp.com':process.env.HOME+'/www/yourApp/'
}
}
);
---
## Starting with forever
*It is helpful especially when running multiple servers to label them* with `` --uid `` for easy to remember process names
*when starting the same server many times, **like every time the system boots** you will want to append to the same log file* so use `` -a ``. Without `` -a `` forever will throw an error stating that the log file for the `` --uid `` already exists.
forever --uid nodeServer -a start ~/git/node-http-server/server/http.js root=~/myApp/ port=9999 launch=now
This can be set as a ``.profile`` command or a ``.bash_rc`` command as well if you want to launch the server every time the computer boots up.
| 39.983051 | 533 | 0.620319 | eng_Latn | 0.989599 |
0c8b992721abf72f363e49bb127bc5f54096da3a | 1,327 | md | Markdown | README.md | ommey/DevOps | 357adfbc6211255c29671a205a674825746bbbc2 | [
"MIT"
] | null | null | null | README.md | ommey/DevOps | 357adfbc6211255c29671a205a674825746bbbc2 | [
"MIT"
] | null | null | null | README.md | ommey/DevOps | 357adfbc6211255c29671a205a674825746bbbc2 | [
"MIT"
] | null | null | null | After the huge response and viewership for my earlier article https://dzone.com/articles/spring-boot-restful-web-service-example I have decided to write a new article with all the REST calls example respectively GET, POST, PUT and DELETE.
Prerequisites for this project:
1. If you have Eclipse, download the STS plug-in from here https://marketplace.eclipse.org/content/spring-tools-aka-spring-ide-and-spring-tool-suite
2. If you don’t have Eclipse, download STS from here https://spring.io/guides/gs/sts/
3. Download the latest JDK from here http://www.oracle.com/technetwork/java/javase/downloads/index.html
4. Also for testing please download and install SOAPUI tool from here https://www.soapui.org/downloads/soapui.html
The first example I am going to explain is about HTTP GET request, second example will be about HTTP POST request, third example about HTTP PUT request and fourth example is for HTTP DELETE request. In these entire examples I am going to use JSON Representation.
Before checkout this project create a folder under C drive like C:\Projects
Now open command prompt,
1. cd c:\Projects
2. check out the mail branch
3. cd spring-boot-rest-2
4. Execute - mvnw clean package
5. start the server - java -jar target\spring-boot-rest-2-0.0.1-SNAPSHOT.jar
Hello User yes you are there, my name is ok
i
| 51.038462 | 262 | 0.785983 | eng_Latn | 0.972544 |
<!-- File: README.md (repo: mnguyen0226/ai-assurance-research) -->
# AI Assurance Research - Future Scientists
## ***** Spring 2021 *****
### [Correlation Analysis](https://github.com/mnguyen0226/ai-assurance-research/tree/main/correlation) for Technology Metrics, Environmental Descriptors, Technology Laws (preprocessed datasets included).
### [Causality Analysis](https://github.com/mnguyen0226/ai-assurance-research/tree/main/causation/DoWhy%20Analysis%20Example) for Technology Metrics, Environmental Descriptors, Technology Laws (preprocessed datasets included).
## ***** Summer 2021 *****
### [Pytorch Review Note](https://github.com/mnguyen0226/ai-assurance-research/blob/main/msa/pytorch_review/basic_concepts.docx)
### [Fashion MNIST CNN](https://github.com/mnguyen0226/ai-assurance-research/blob/main/msa/pytorch_review/cnn_fashion_mini_mnist.ipynb)
### [Fashion MNIST Hyperparameter Testing + Tensorboard](https://github.com/mnguyen0226/ai-assurance-research/blob/main/msa/pytorch_review/cnn_fashion_mini_mnist.ipynb)
### [Fashion MNIST Sequential Models](https://github.com/mnguyen0226/ai-assurance-research/blob/main/msa/pytorch_review/sequential_class.ipynb)
### [Fashion MNIST Batch Normalization](https://github.com/mnguyen0226/ai-assurance-research/blob/main/msa/pytorch_review/batch_norm.ipynb)
| 69.666667 | 226 | 0.795853 | yue_Hant | 0.209646 |
<!-- File: windows.networking/hostname_ipinformation.md (repo: gbaychev/winrt-api) -->
---
-api-id: P:Windows.Networking.HostName.IPInformation
-api-type: winrt property
---
<!-- Property syntax
public Windows.Networking.Connectivity.IPInformation IPInformation { get; }
-->
# Windows.Networking.HostName.IPInformation
## -description
Gets the [IPInformation](../windows.networking.connectivity/ipinformation.md) object for a local IP address assigned to a [HostName](hostname.md) object.
## -property-value
The [IPInformation](../windows.networking.connectivity/ipinformation.md) object for the IP address.
## -remarks
The [IPInformation](../windows.networking.connectivity/ipinformation.md) property is only set when the [HostName](hostname.md) object is a local IPv4 or IPv6 address returned by the [GetHostNames](../windows.networking.connectivity/networkinformation_gethostnames_136280557.md) method. This property represents the [IPInformation](../windows.networking.connectivity/ipinformation.md) object for the local IP address. If the [HostName](hostname.md) object is not a local IPv4 or IPv6 address, this property will be null.
## -examples
## -see-also
[DatagramSocket](../windows.networking.sockets/datagramsocket.md), [StreamSocket](../windows.networking.sockets/streamsocket.md) | 50.583333 | 519 | 0.788303 | eng_Latn | 0.76533 |
0c8d14a69ac233c38c0edea7f8f37853e0502587 | 4,109 | md | Markdown | logya/sites/docs/content/documentstructure.md | yaph/logya | 9647f58a0b8653b56ad64332e235a76cab3acda9 | [
"MIT"
] | 12 | 2015-03-04T03:23:56.000Z | 2020-11-17T08:09:17.000Z | logya/sites/docs/content/documentstructure.md | elaOnMars/logya | a9f256ac8840e21b348ac842b35683224e25b613 | [
"MIT"
] | 78 | 2015-01-05T11:40:41.000Z | 2022-01-23T21:05:39.000Z | logya/sites/docs/content/documentstructure.md | elaOnMars/logya | a9f256ac8840e21b348ac842b35683224e25b613 | [
"MIT"
] | 6 | 2015-04-20T06:58:42.000Z | 2022-01-31T00:36:29.000Z | ---
page: 4
title: Document Structure
template: page.html
created: 2013-09-08 19:45:45
---
Documents are divided into header and body parts.
[TOC]
## Document Header
---
title: Logya Documentation
template: page.html
created: 2012-03-18 13:59:16
image: /path/to/image.png
---
The header is in [YAML](https://yaml.org/) format. It starts and ends with 3 dashes. The header is used for setting document attributes, that can be of arbitrary complexity and are accessible in templates. You can set a meta description, the scripts and stylesheets to include and whatever is useful in your scenario. The following attributes are reserved to have special meanings.
### Special Attributes
#### body
Don't set a `body` attribute in the document header. It will be set to the body of the document automatically. An existing value will be overwritten.
#### template
This is the only required attribute. Set it to the Jinja2 template to use for rendering the document. The template file name has to be specified relative to the `templates` directory.
#### title
Setting a document `title` is highly recommended, but if not present the stem part of the content file name will be set as the document title.
#### url
You can manually set a `url` attribute, which must be unique and can be used to refer to a document in templates. If you omit the `url` it will be created from the file name, e. g. the document in `content/book/chapter-1.html` will get the URL `/book/chapter-1/` with the file extension removed. File extensions are removed from HTML and Markdown files.
#### created
If you specify a `created` datetime, you must use the format `YYYY-MM-DD HH:MM:SS` as shown in the example. Otherwise it will be set to the file modification time. I recommend setting this manually to the date of first publication. When you call the `get_docs` function in templates, by default documents will be sorted by `created` in descending order, so newest documents show up first. This is also the order in which documents appear in automatically created collection pages.
#### updated
The `updated` datetime works like `created` and should show when the document was last edited. This can be useful if you want to highlight an edit, but typically the default value is fine.
#### pre_render
Use `pre_render` to enable the use of Jinja template syntax and documents. A sample use case would be for creating absolute URLs for internal links using the `base_url` setting.
<a href="{{ base_url }}/path/to/page/">Link text</a>
For this to work you have to set `pre_render` to a list of attribute names. To pre-render the `body` add the following line to the document header.
pre_render: [body]
## Collections
You can create document collections using content attributes and corresponding settings in `site.yaml`. An example is to categorize content using a `tags` attribute.
tags: [source code, python, programming]
For each value in `tags` a collection is created, that contains all documents where value appears in the `tags` attribute. For each tag value a page is automatically created where you can show the documents in that collection. The URL of the collection page is created from the collection `path` value in `site.yaml` and the value, e. g. the `source code` collection URL could be `/tag/source-code/`.
For each collection in a document, an additional template variable is made available, named after the collection name plus the suffix `_links`, e.g. `tags_links`. It can be used in templates to create a list of links to the collection pages on a content page.
<ul>
{% for url, anchor in tags_links %}
<li><a href="{{ url }}">{{ anchor }}</a></li>
{% endfor %}
</ul>
Only use letters and underscores in the names of collections and set the document attribute to a list of string values.
## Document Body
The part after the second `---` separator is the document `body`. Text written in [Markdown](https://daringfireball.net/projects/markdown/) will be converted to HTML, if the corresponding file name ends with `.md` or `.markdown`. | 52.679487 | 480 | 0.749817 | eng_Latn | 0.999145 |
0c8d159eb59a8e78d901ea5f09be62e815ddcdd5 | 648 | md | Markdown | README.md | shatalovdm/feedreader | ee8cfaa429f41a3a6e38cb389bb62e2f68d209fa | [
"MIT"
] | null | null | null | README.md | shatalovdm/feedreader | ee8cfaa429f41a3a6e38cb389bb62e2f68d209fa | [
"MIT"
] | null | null | null | README.md | shatalovdm/feedreader | ee8cfaa429f41a3a6e38cb389bb62e2f68d209fa | [
"MIT"
] | null | null | null | # Feedreader
This is a project craeted to practice Jasmine testing framework. All test suites can be found in `jasmine/spec/feedreader.js`.
## Getting started
### Installation
1. Clone or download the repository:
```
$ git clone https://github.com/shatalovdm/feedreader
```
2. Go to the root of the project and run Python HTTP server specifying the port number:
```
$ python -m SimpleHTTPServer 8080
```
3. Open `localhost:8080` in your browser.
### Demo
The demo version is available for use [here](http://dshatalov.com/feedreader/).
## License
This project is released under the [MIT License](https://opensource.org/licenses/MIT).
| 22.344828 | 126 | 0.734568 | eng_Latn | 0.96337 |
0c8e4a1f291108ce90be48cef98e179ef636fe34 | 549 | md | Markdown | README.md | clamsproject/mmif-python | e1f1c4371df5b0a3f89458e810d8b3a024c7d985 | [
"Apache-2.0"
] | 1 | 2021-01-17T19:47:50.000Z | 2021-01-17T19:47:50.000Z | README.md | clamsproject/mmif-python | e1f1c4371df5b0a3f89458e810d8b3a024c7d985 | [
"Apache-2.0"
] | 179 | 2020-11-24T00:23:53.000Z | 2022-03-31T14:57:49.000Z | README.md | clamsproject/mmif-python | e1f1c4371df5b0a3f89458e810d8b3a024c7d985 | [
"Apache-2.0"
] | null | null | null | # MMIF for python
**NOTE** that this project is in pre-alpha and being actively developed. Nothing is guaranteed to reliably work for the moment and developer need to be very careful when using APIs implemented here. Please use [the issue track](https://github.com/clamsproject/mmif/issues) to report bugs and malfunctions.
## MultiMedia Interchange Format
[MMIF](https://mmif.clams.ai) is a JSON(-LD)-based data format designed for transferring annotation data between computational analysis applications in [CLAMS project](https://clams.ai).
| 68.625 | 305 | 0.786885 | eng_Latn | 0.994015 |
0c8e99d6f48cb0b41155b6f64fe05a0faf3cff63 | 15,432 | md | Markdown | _posts/2018-06-13-Apache-HttpClient.md | clibing/clibing.github.io | edcc041716b971e46f651feed70211c8e51d7ded | [
"MIT"
] | 2 | 2018-07-10T11:05:32.000Z | 2019-09-19T11:30:37.000Z | _posts/2018-06-13-Apache-HttpClient.md | clibing/clibing.github.io | edcc041716b971e46f651feed70211c8e51d7ded | [
"MIT"
] | null | null | null | _posts/2018-06-13-Apache-HttpClient.md | clibing/clibing.github.io | edcc041716b971e46f651feed70211c8e51d7ded | [
"MIT"
] | 1 | 2019-09-19T11:30:40.000Z | 2019-09-19T11:30:40.000Z | ---
layout: post
title: HttpClient 工具类
categories: [Java, HTTP]
description: HttpClient 工具类
keywords: java,http
---
#### HttpClient 工具类
```java
package cn.linuxcrypt.utils;
import org.apache.http.*;
import org.apache.http.client.ClientProtocolException;
import org.apache.http.client.ResponseHandler;
import org.apache.http.client.config.CookieSpecs;
import org.apache.http.client.config.RequestConfig;
import org.apache.http.client.entity.UrlEncodedFormEntity;
import org.apache.http.client.methods.CloseableHttpResponse;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.client.methods.HttpPost;
import org.apache.http.client.methods.HttpRequestBase;
import org.apache.http.client.protocol.HttpClientContext;
import org.apache.http.config.ConnectionConfig;
import org.apache.http.config.Registry;
import org.apache.http.config.RegistryBuilder;
import org.apache.http.conn.ConnectionKeepAliveStrategy;
import org.apache.http.conn.socket.ConnectionSocketFactory;
import org.apache.http.conn.socket.PlainConnectionSocketFactory;
import org.apache.http.conn.ssl.SSLConnectionSocketFactory;
import org.apache.http.entity.ContentType;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.DefaultHttpRequestRetryHandler;
import org.apache.http.impl.conn.PoolingHttpClientConnectionManager;
import org.apache.http.message.BasicHeader;
import org.apache.http.message.BasicHeaderElementIterator;
import org.apache.http.message.BasicNameValuePair;
import org.apache.http.protocol.HTTP;
import org.apache.http.protocol.HttpContext;
import org.apache.http.ssl.SSLContextBuilder;
import org.apache.http.util.Args;
import org.apache.http.util.EntityUtils;
import javax.net.ssl.SSLContext;
import javax.net.ssl.TrustManager;
import javax.net.ssl.X509TrustManager;
import java.io.IOException;
import java.nio.charset.Charset;
import java.nio.charset.CodingErrorAction;
import java.security.cert.X509Certificate;
import java.util.*;
public final class HttpClients {
    // Maximum number of connections for the whole pool
private static int POOL_MAX_TOTAL = 2;
/**
     * Set the maximum number of connections for the whole pool.
*
* @param maxTotal
*/
public static void setPoolMaxTotal(int maxTotal) {
synchronized (HttpClients.class) {
POOL_MAX_TOTAL = maxTotal;
HttpClientPool.setMaxTotal(maxTotal);
}
}
/**
     * HTTP connection pool
*/
static class HttpClientPool {
private static PoolingHttpClientConnectionManager poolingHttpClientConnectionManager = null;
        // Default number of connections per route; use setMaxPerRoute to set a per-host maximum individually.
private static final int POOL_MAX_PER_ROUTER = 1;
private static final String DEFAULT_USER_AGENT = "Mozilla/5.0 (Windows NT 6.1; WOW64; rv:60.0) Gecko/20100101 Firefox/60.0";
// keepalive
private static int DEFAULT_KEEP_ALIVE = 30 * 1000;
/**
         * Timeout in milliseconds for leasing a request connection from the pool.
         * -1: the system default timeout (kernel-level configuration)
         * 0: no limit.
         * See {@link org.apache.http.client.config.RequestConfig#getConnectionRequestTimeout()} for details.
*/
public static final int DEFAULT_CONNECTION_REQUEST_TIMEOUT = -1;
        // Default connect timeout
public static final int DEFAULT_CONNECT_TIMEOUT = 10000;
        // Default socket read timeout; long-running requests (such as file transfers) must override this.
public static final int DEFAULT_SO_TIMEOUT = 15000;
static {
Registry<ConnectionSocketFactory> socketFactoryRegistry = null;
try {
final SSLContext sslContext = SSLContextBuilder.create().build();
sslContext.init(null, new TrustManager[]{
new X509TrustManager() {
public X509Certificate[] getAcceptedIssuers() {
return null;
}
public void checkClientTrusted(X509Certificate[] certs, String authType) {
}
public void checkServerTrusted(X509Certificate[] certs, String authType) {
}
}
}, null);
socketFactoryRegistry = RegistryBuilder
.<ConnectionSocketFactory>create()
.register("http", PlainConnectionSocketFactory.INSTANCE)
.register("https", new SSLConnectionSocketFactory(sslContext)).build();
            } catch (Exception e) {
                // Swallowed deliberately: if SSL setup fails, socketFactoryRegistry stays null.
            }
poolingHttpClientConnectionManager = new PoolingHttpClientConnectionManager(socketFactoryRegistry);
            // Maximum number of connections in the pool
poolingHttpClientConnectionManager.setMaxTotal(POOL_MAX_TOTAL);
/**
             * Default number of connections per route; setMaxPerRoute sets a per-host maximum individually.
             *
             * DefaultMaxPerRoute subdivides MaxTotal by target host. For example, with
             * MaxTotal=400 and DefaultMaxPerRoute=200:
             * connecting only to http://a.com, concurrency to that host is capped at 200, not 400;
             * connecting to http://a.com and http://b.com, each host is capped at 200 (400 combined,
             * never more); the setting that actually takes effect is DefaultMaxPerRoute.
*/
poolingHttpClientConnectionManager.setDefaultMaxPerRoute(POOL_MAX_PER_ROUTER);
            // Default connection configuration
ConnectionConfig connectionConfig = ConnectionConfig.custom()
.setMalformedInputAction(CodingErrorAction.IGNORE)
.setUnmappableInputAction(CodingErrorAction.IGNORE)
.setCharset(Consts.UTF_8)
.build();
poolingHttpClientConnectionManager.setDefaultConnectionConfig(connectionConfig);
}
public static void setMaxTotal(int maxTotal) {
poolingHttpClientConnectionManager.setMaxTotal(maxTotal);
}
/**
 * Build the default HTTP headers added to every request.
 *
 * @return the default header set
 * @see <a href="https://www.cnblogs.com/lwhkdash/archive/2012/10/14/2723252.html">HTTP header overview</a>
 */
private static Set<Header> defaultHeaders() {
Set<Header> header = new HashSet<>();
Header accept = new BasicHeader(HttpHeaders.ACCEPT,
"text/html,application/xhtml+xml,application/json,application/xml;q=0.9,*/*;q=0.8");
header.add(accept);
Header acceptEncoding = new BasicHeader(HttpHeaders.ACCEPT_ENCODING, "gzip, deflate");
header.add(acceptEncoding);
Header acceptLanguage = new BasicHeader(HttpHeaders.ACCEPT_LANGUAGE, "zh-CN,zh;q=0.8,en-US;q=0.5,en;q=0.3");
header.add(acceptLanguage);
Header connect = new BasicHeader(HttpHeaders.CONNECTION, "keep-alive");
header.add(connect);
Header acceptCharset = new BasicHeader(HttpHeaders.ACCEPT_CHARSET, Consts.UTF_8.name());
header.add(acceptCharset);
// DNT is short for "Do Not Track": it asks the server not to track the user. DNT: 1 enables it,
// DNT: 0 disables it. Firefox, Safari and IE9 all support this header; it was submitted to the
// IETF for standardization on 2011-03-07.
Header dnt = new BasicHeader("DNT", "1");
header.add(dnt);
return header;
}
/**
 * Get a pooled HttpClient with the default timeouts and retries disabled.
 * @return a CloseableHttpClient backed by the shared connection pool
 */
public static CloseableHttpClient getHttpClient() {
return getHttpClient(DEFAULT_SO_TIMEOUT, DEFAULT_CONNECT_TIMEOUT, 0);
}
/**
 * Default keep-alive strategy: if the response carries the server's keep-alive timeout,
 * use it; otherwise fall back to the default.
 */
public static class DefaultConnectionKeepAliveStrategy implements ConnectionKeepAliveStrategy {
public long getKeepAliveDuration(HttpResponse response, HttpContext context) {
HeaderElementIterator it = new BasicHeaderElementIterator(response.headerIterator(HTTP.CONN_KEEP_ALIVE));
while (it.hasNext()) {
HeaderElement he = it.nextElement();
String param = he.getName();
String value = he.getValue();
if (value != null && param.equalsIgnoreCase("timeout")) {
try {
return Long.parseLong(value) * 1000;
} catch (NumberFormatException ignore) {
}
}
}
return DEFAULT_KEEP_ALIVE; // default: 30 seconds
}
}
public static CloseableHttpClient getHttpClient(int socketTimeout, int connectTimeout, int retryCount) {
RequestConfig globalConfig = RequestConfig.custom()
.setCookieSpec(CookieSpecs.IGNORE_COOKIES)
.setSocketTimeout(socketTimeout)
.setConnectionRequestTimeout(DEFAULT_CONNECTION_REQUEST_TIMEOUT)
.setConnectTimeout(connectTimeout)
.build();
CloseableHttpClient closeableHttpClient = org.apache.http.impl.client.HttpClients
.custom()
.setConnectionManager(poolingHttpClientConnectionManager)
.setKeepAliveStrategy(new DefaultConnectionKeepAliveStrategy())
// Retry count for the client (HttpClient's default is 3); passing retryCount = 0 disables
// retries. For low-volume projects the library default is fine.
.setRetryHandler(new DefaultHttpRequestRetryHandler(retryCount, false))
.setUserAgent(DEFAULT_USER_AGENT)
.setDefaultHeaders(defaultHeaders())
.setDefaultRequestConfig(globalConfig)
.setConnectionManagerShared(true)
.evictExpiredConnections() // start the expired-connection eviction thread (caveat: the builder may not start it when setConnectionManagerShared(true) is set)
.build();
return closeableHttpClient;
}
}
/**
 * For special requests (for example, ones that involve cookie handling or authentication),
 * the global defaults are no longer enough. In that case a request can run with a
 * configuration independent of the global one, without disturbing requests on other threads:
 * that mechanism is HttpClientContext. Implementors receive a context based on a copy of the
 * global configuration and tweak it via the HttpClientContext.setXxx methods.
 */
public static interface HttpClientContextSetter {
public void setHttpClientContext(HttpClientContext context);
}
/**
 * <p>Execute an HTTP request.</p>
 *
 * @param httpMethod              the HTTP request (HttpGet, HttpPost, and so on)
 * @param httpClientContextSetter optional; per-request setup such as cookies or authentication
 * @param responseHandler         required; turns the response into a result (e.g. handling each HTTP status)
 * @return the value produced by responseHandler; prefer
 *         org.apache.http.impl.client.CloseableHttpClient#execute(org.apache.http.HttpHost,
 *         org.apache.http.HttpRequest, org.apache.http.client.ResponseHandler,
 *         org.apache.http.protocol.HttpContext) when more control is needed
 */
public static <T> T doHttpRequest(HttpRequestBase httpMethod, HttpClientContextSetter httpClientContextSetter, ResponseHandler<T> responseHandler) {
Args.notNull(httpMethod, "Parameter 'httpMethod' can not be null!");
Args.notNull(responseHandler, "Parameter 'responseHandler' can not be null!");
CloseableHttpResponse response = null;
try {
if (httpClientContextSetter != null) {
HttpClientContext context = HttpClientContext.create();
httpClientContextSetter.setHttpClientContext(context);
response = HttpClientPool.getHttpClient().execute(httpMethod, context);
} else {
response = HttpClientPool.getHttpClient().execute(httpMethod);
}
return response == null ? null : responseHandler.handleResponse(response);
} catch (Exception e) {
throw new RuntimeException(e.getMessage(), e);
} finally {
if (response != null) {
try {
response.close();
} catch (IOException e) {
// ignore: the response is being released; nothing useful can be done here
}
}
}
}
/**
 * Default ResponseHandler that returns the response body as a String.
 */
public static class DefaultStringResponseHandler implements ResponseHandler<String> {
/**
 * Default charset used to decode the response body.
 */
private Charset defaultCharset = Consts.UTF_8;
public DefaultStringResponseHandler() {
super();
}
public DefaultStringResponseHandler(String defaultCharset) {
super();
this.defaultCharset = Charset.forName(defaultCharset);
}
public Charset getDefaultCharset() {
return defaultCharset;
}
public void setDefaultCharset(Charset defaultCharset) {
this.defaultCharset = defaultCharset;
}
public String handleResponse(HttpResponse response) throws ClientProtocolException, IOException {
HttpEntity httpEntity = response.getEntity();
if (httpEntity != null) {
return EntityUtils.toString(httpEntity, defaultCharset == null ? ContentType.getOrDefault(httpEntity).getCharset() : defaultCharset);
}
return null;
}
}
/**
 * <p>Create an HttpPost for the given URL and form parameters.</p>
 *
 * @param url      the target URL
 * @param paramMap form fields, sent URL-encoded in UTF-8
 * @return the configured HttpPost
 */
public static HttpPost createHttpPost(String url, Map<String, String> paramMap) {
try {
HttpPost httpPost = new HttpPost(url);
if (paramMap != null && !paramMap.isEmpty()) {
List<NameValuePair> params = new ArrayList<NameValuePair>();
for (Map.Entry<String, String> entry : paramMap.entrySet()) {
params.add(new BasicNameValuePair(entry.getKey(), entry.getValue()));
}
UrlEncodedFormEntity formEntity = new UrlEncodedFormEntity(params, Consts.UTF_8.name());
httpPost.setEntity(formEntity);
}
return httpPost;
} catch (Exception e) {
throw new RuntimeException(e.getMessage(), e);
}
}
public static String get(String url) {
HttpGet httpGet = new HttpGet(url);
String value = doHttpRequest(httpGet, null, new DefaultStringResponseHandler());
return value;
}
public static String post(String url, Map<String, String> param){
HttpPost post = createHttpPost(url, param);
return doHttpRequest(post, null, new DefaultStringResponseHandler());
}
}
```
#### Summary
1. The `DefaultMaxPerRoute` and `MaxTotal` settings
`DefaultMaxPerRoute` partitions `MaxTotal` by target host. For example, with MaxTotal=400 and DefaultMaxPerRoute=200:
connecting only to http://a.com, at most 200 concurrent connections go to that host, not 400;
connecting to both http://a.com and http://b.com, each host gets at most 200, i.e. 400 in total (never exceeding 400); the setting that actually takes effect is DefaultMaxPerRoute.
2. Timeouts
* connectionRequestTimeout: timeout, in milliseconds, for obtaining a connection from the pool; -1 means the system default (a kernel-level setting), 0 means unlimited
* connectTimeout: default connect timeout
* soTimeout: default socket read timeout; long-running requests (e.g. file transfers) must override it
3. Strategies
* evictExpiredConnections(): starts a background thread that cleans up expired connections
* when a connection is requested, reusable connections are selected and evicted using an LRU policy
4. Pools within the pool
* MaxTotal, set when the client is initialized, sizes the overall connection pool
* inside that pool, smaller sub-pools are partitioned by host name (route)
* when a reusable connection is fetched, an LRU policy decides which connections to clean up or release
5. The available and leased sets
* see org.apache.http.pool.AbstractConnPool
* leased: the connections currently lent out
* available: the connections currently idle and reusable
* LRU eviction weighs MaxTotal against the sizes of the leased and available sets
6. Future<CPoolEntry> is used to obtain a pooled connection asynchronously
7. Persistent connections
* HTTP/1.1 replaced Keep-Alive with persistent connections
* HTTP/1.1 connections are persistent by default; to close one explicitly, send a Connection: Close header in the message. In other words, under HTTP/1.1 all connections are reused
* under both schemes, an idle persistent connection can be closed by the client or the server at any time; not sending Connection: Close does not mean the server promises to keep the connection open forever
8. EntityUtils.toString(HttpEntity, ...) and EntityUtils.consume(HttpEntity)
* note that these methods close the entity's InputStream; because connections are reused, the stream must be fully consumed and closed after every read, otherwise the connection cannot be returned to the pool
* call response.close() on each CloseableHttpResponse when the request finishes
9. Do not close the HttpClient itself, or connections will be re-established every time, paying the TCP 3-way handshake to connect and the 4-way handshake to tear down
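The MaxTotal / DefaultMaxPerRoute partitioning described above reduces to simple arithmetic. The sketch below is illustrative only: the class and method names are invented for this example and are not part of HttpClient.

```java
public class PoolMath {
    /** Concurrency limit toward a single host: the per-route cap, itself capped by the pool total. */
    public static int perHostLimit(int maxTotal, int defaultMaxPerRoute) {
        return Math.min(maxTotal, defaultMaxPerRoute);
    }

    /** Total concurrency across n distinct hosts: n per-route caps, capped by the pool total. */
    public static int totalLimit(int maxTotal, int defaultMaxPerRoute, int hosts) {
        return Math.min(maxTotal, hosts * perHostLimit(maxTotal, defaultMaxPerRoute));
    }

    public static void main(String[] args) {
        // MaxTotal=400, DefaultMaxPerRoute=200, as in the example above
        System.out.println(perHostLimit(400, 200));  // one host (a.com): 200, not 400
        System.out.println(totalLimit(400, 200, 2)); // a.com + b.com: 400 in total
    }
}
```

With these numbers, a single-host crawler never uses more than 200 connections, which is why DefaultMaxPerRoute, not MaxTotal, is the setting that governs per-host throughput.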
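The keep-alive negotiation performed by DefaultConnectionKeepAliveStrategy can also be sketched in isolation: parse the response's Keep-Alive header for a timeout parameter, convert the server's seconds to milliseconds, and fall back to a default otherwise. This standalone sketch uses no HttpClient types; the class name is invented for the example.

```java
public class KeepAliveParse {
    static final long DEFAULT_KEEP_ALIVE_MS = 30_000L; // fallback: 30 seconds

    /** Parse a Keep-Alive header value such as "timeout=5, max=100" into milliseconds. */
    public static long keepAliveMillis(String headerValue) {
        if (headerValue != null) {
            for (String element : headerValue.split(",")) {
                String[] nv = element.trim().split("=", 2);
                if (nv.length == 2 && nv[0].trim().equalsIgnoreCase("timeout")) {
                    try {
                        return Long.parseLong(nv[1].trim()) * 1000L; // server reports seconds
                    } catch (NumberFormatException ignore) {
                        // malformed value: fall through to the default
                    }
                }
            }
        }
        return DEFAULT_KEEP_ALIVE_MS;
    }

    public static void main(String[] args) {
        System.out.println(keepAliveMillis("timeout=5, max=100")); // 5000
        System.out.println(keepAliveMillis("max=100"));            // 30000 (fallback)
        System.out.println(keepAliveMillis(null));                 // 30000 (fallback)
    }
}
```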
#### Test
A packet capture of the pooled client, checking the TCP 3-way handshake on connect and the 4-way handshake on close (the capture shows only the initial 3-way handshake, confirming that connections are being reused).

#### References
* [Reference 1](https://blog.csdn.net/undergrowth/article/details/77341760)
* [Reference 2](https://blog.csdn.net/undergrowth/article/details/77203668)
* [Reference 3](http://www.cnblogs.com/kingszelda/p/8988505.html)
| 39.670951 | 152 | 0.649235 | yue_Hant | 0.610204 |
0c8f1ff2cb19487f4c8755d350f71c8cb3c18dbf | 1,590 | md | Markdown | _posts/17/2021-04-07-devorah-roloff.md | chito365/ukdat | 382c0628a4a8bed0f504f6414496281daf78f2d8 | [
"MIT"
] | null | null | null | _posts/17/2021-04-07-devorah-roloff.md | chito365/ukdat | 382c0628a4a8bed0f504f6414496281daf78f2d8 | [
"MIT"
] | null | null | null | _posts/17/2021-04-07-devorah-roloff.md | chito365/ukdat | 382c0628a4a8bed0f504f6414496281daf78f2d8 | [
"MIT"
] | null | null | null | ---
id: 7645
title: Devorah Roloff
date: 2021-04-07T04:13:26+00:00
author: victor
layout: post
guid: https://ukdataservers.com/devorah-roloff/
permalink: /04/07/devorah-roloff
tags:
- claims
- lawyer
- doctor
- house
- multi family
- online
- poll
- business
- unspecified
- single
- relationship
- engaged
- married
- complicated
- open relationship
- widowed
- separated
- divorced
- Husband
- Wife
- Boyfriend
- Girlfriend
category: Guides
---
* some text
{: toc}
## Who is Devorah Roloff
Popular content creator who is best known for the vlog-style content she posts to her self-titled YouTube channel. Her lifestyle and challenge videos have earned her over 80,000 subscribers on the video platform.
## Prior to Popularity
She launched her YouTube channel with a Q&A video in June 2020.
## Random data
She has accumulated over 110,000 followers on her devorahlazar Instagram account, where she posts a variety of different fashion looks.
## Family & Everyday Life of Devorah Roloff
She was born and raised in the United States. Her mom appeared in her “come to Florida with me & my mom” YouTube video.
## People Related With Devorah Roloff
She has collaborated with her best friend Claudia Walsh on her YouTube channel.
| 18.488372 | 213 | 0.596855 | eng_Latn | 0.996852 |
0c90051f2bdb0e46f4450f53547360e198a4203e | 2,453 | md | Markdown | README.md | rforbiodatascience22/group_2_package | e69553e5d65d1fe09262ce872c8aecfd075e8b84 | [
"MIT"
] | null | null | null | README.md | rforbiodatascience22/group_2_package | e69553e5d65d1fe09262ce872c8aecfd075e8b84 | [
"MIT"
] | null | null | null | README.md | rforbiodatascience22/group_2_package | e69553e5d65d1fe09262ce872c8aecfd075e8b84 | [
"MIT"
] | null | null | null |
<!-- README.md is generated from README.Rmd. Please edit that file -->
### GitHub repository
The URL is : <https://github.com/rforbiodatascience22/group_2_package>
***
**Package description**
The purpose of this package is to simulate the **Central Dogma of
Molecular Biology**. To show the flow of information from DNA > RNA > Protein
***
Package usage (with examples)
**Codon Table**
A codon table is necessary for the `translation` function The codon
table in this package is sourced from NCBI, stored in the object
`codon_table`.
### Function 1: `sample_with_replacement`
`sample_with_replacement` creates a DNA sequence, based on random
sampling with replacement. The DNA sequence length is specified with the
argument `nucleotide_size`. **Example**:
dna <- sample_with_replacement(nucleotide_size = 50)
Generates an object `dna` with a `nucleotide_size` 50
***
### Function 2: `TU_sub`
`TU_sub` simulates the transcription process. The function replaces the
base T (thymine) with U (uracil). This replacement mimics transcription,
and effectively creates an translatable RNA strand. **Example**:
rna <- TU_sub(dna = dna)
Generates an object `rna` based on the output from
`sample_with_replacement`.
***
### Function 3: `codon_start`
`codon_start` splits the `rna` object generated with `TU_sub` into
codons. **Example**:
codons <- codon_start(rna = rna)
Generates `codons` based on output from `TU_sub`.
***
### Function 4: `translation`
`translation` well… translates. This function takes an object that has
been neatly divided into codons, and uses `codon_table` to generate a
polypeptide chain based on the output from `codon_start`. **Example**:
protein <- translation(codons = codons)
Generates `protein` based on output from `TU_sub`.
***
### Function 5: `aa_plot`
`aa_plot` uses `ggplot2` to plot the counts of each amino acid residue
in the generated polypeptide chain. **Example**:
aa_plot(peptide = protein)
Generated plot is based on the output from `translation`.
***
**Simulating mutations**
The package assumes flawless transcription and translation from start to
finish. However, in order simulate point mutations, an intermediary step
could be added between `sample_with_replacement` and `TU_sub`, to
replace bases with a “-”. Similarly, either preceding or following
`TU_sub`, a step that truncated or added 1 or 2 bases could be added, to
simulate a reading frame shift.
| 26.376344 | 77 | 0.746841 | eng_Latn | 0.994887 |
0c9042545ab5ba353413fc956df4b11c74267e0f | 1,459 | md | Markdown | source/_posts/2019-03-03-nlp-task1.md | MixLabPro/2019 | 3f09e8a2a763c429f8898125426dff395c4519a8 | [
"MIT"
] | 1 | 2019-03-30T13:51:31.000Z | 2019-03-30T13:51:31.000Z | source/_posts/2019-03-03-nlp-task1.md | MixLabPro/2019 | 3f09e8a2a763c429f8898125426dff395c4519a8 | [
"MIT"
] | 1 | 2021-05-08T05:47:08.000Z | 2021-05-08T05:47:08.000Z | source/_posts/2019-03-03-nlp-task1.md | MixLabPro/2019 | 3f09e8a2a763c429f8898125426dff395c4519a8 | [
"MIT"
] | null | null | null | ---
title: "任务一-数据集探索"
categories: "NLP实践"
tags:
- ML
- NLP
comments: true
date: 2019-03-03 20:06:01
---
# 数据集
数据集:中、英文数据集各一份
## 中文数据集 THUCNews
THUCNews数据子集:[https://pan.baidu.com/s/1hugrfRu](https://pan.baidu.com/s/1hugrfRu) 密码:qfud
## 英文数据集 IMDB数据集
IMDB数据集 [Sentiment Analysis](http://ai.stanford.edu/~amaas/data/sentiment/)
<!--more-->
# IMDB数据集下载和探索
参考TensorFlow官方教程:[影评文本分类 | TensorFlow](https://tensorflow.google.cn/tutorials/keras/basic_text_classification)
[科赛 - Kesci.com](https://www.kesci.com/home/project/5b6c05409889570010ccce90)
# THUCNews数据集下载和探索
参考博客中的数据集部分和预处理部分:[CNN字符级中文文本分类-基于TensorFlow实现 - 一蓑烟雨 - CSDN博客](https://blog.csdn.net/u011439796/article/details/77692621)
参考代码:[text-classification-cnn-rnn/cnews_loader.py at mas...](https://github.com/gaussic/text-classification-cnn-rnn/blob/master/data/cnews_loader.py)
# 学习召回率、准确率、ROC曲线、AUC、PR曲线这些基本概念
* TP : 把正类预测为正类¶
* FP : 把负类预测为正类
* TN : 把正类预测为负类
* FN : 把负类预测为负类
* 准确率(accuracy) = (TP + TN) / (TP + FN + FP + TN)
* 精确率(precision) = TP / (TP + FP)
* 召回率(recall) = TP / (TP + FN)
* ROC曲线(受试者工作特征曲线):横轴为FPR,纵轴为TPR。
* FPR = FP / (FP + TN) TPR = TP / (TP + FN)
* AUC(Area under curve):ROC曲线下的面积
AUC的值一般在0.5-1之间,小于0.5表示分类器比随机分还要差。
* PR曲线展示的是准确率和召回率的曲线,PR曲线与ROC曲线的相同点是都采用了TPR (Recall),都可以用AUC来衡量分类器的效果。不同点是ROC曲线使用了FPR,而PR曲线使用了Precision,因此PR曲线的两个指标都聚焦于正例。类别不平衡问题中由于主要关心正例,所以在此情况下PR曲线被广泛认为优于ROC曲线
# 参考
[机器学习之类别不平衡问题 (2) —— ROC和PR曲线_慕课手记](https://www.imooc.com/article/48072)
| 26.527273 | 162 | 0.732694 | yue_Hant | 0.911553 |
0c908199e8a7a3f318439a9b2289df7cb75022aa | 5,685 | md | Markdown | README.md | steedos/Rocket.Chat.Electron | c779740cfd4df07c1dd780029c6f0cf7d8c3b7e8 | [
"MIT",
"Unlicense"
] | null | null | null | README.md | steedos/Rocket.Chat.Electron | c779740cfd4df07c1dd780029c6f0cf7d8c3b7e8 | [
"MIT",
"Unlicense"
] | null | null | null | README.md | steedos/Rocket.Chat.Electron | c779740cfd4df07c1dd780029c6f0cf7d8c3b7e8 | [
"MIT",
"Unlicense"
] | 1 | 2021-07-27T06:31:26.000Z | 2021-07-27T06:31:26.000Z | 
Rocket.Chat Native Cross-Platform Desktop Application via Electron
#### Downloads
Follow the link to [download the installer](https://github.com/RocketChat/Rocket.Chat.Electron/releases)
#### Contributions
Useful resources for creating apps with Electron
https://github.com/sindresorhus/awesome-electron
# Quick start
The only development dependency of this project is [Node.js](https://nodejs.org). So just make sure you have it installed.
Then type few commands known to every Node developer...
```
npm install electron-prebuilt --save-dev
git clone https://github.com/RocketChat/Rocket.Chat.Electron.git
cd Rocket.Chat.Electron
npm install
npm start
```
... and boom! You have running desktop application on your screen.
# Structure of the project
There are **two** `package.json` files:
#### 1. For development
Sits on path: `Rocket.Chat.Electron/package.json`. Here you declare dependencies for your development environment and build scripts. **This file is not distributed with real application!**
Also here you declare the version of Electron runtime you want to use:
```json
"devDependencies": {
"electron-prebuilt": "^0.24.0"
}
```
#### 2. For your application
Sits on path: `Rocket.Chat.Electron/app/package.json`. This is **real** manifest of your application. Declare your app dependencies here.
### Project's folders
- `app` - code of your application goes here.
- `config` - place for you to declare environment specific stuff.
- `build` - in this folder lands built, runnable application.
- `releases` - ready for distribution installers will land here.
- `resources` - resources for particular operating system.
- `tasks` - build and development environment scripts.
# Development
#### Installation
```
npm install
```
It will also download Electron runtime, and install dependencies for second `package.json` file inside `app` folder.
#### Starting the app
```
npm start
```
#### Adding pure-js npm modules to your app
Remember to add your dependency to `app/package.json` file, so do:
```
cd app
npm install name_of_npm_module --save
```
#### Adding native npm modules to your app
If you want to install native module you need to compile it agains Electron, not Node.js you are firing in command line by typing `npm install` [(Read more)](https://github.com/atom/electron/blob/master/docs/tutorial/using-native-node-modules.md).
```
npm run app-install -- name_of_npm_module
```
Of course this method works also for pure-js modules, so you can use it all the time if you're able to remember such an ugly command.
#### Module loader
How about splitting your JavaScript code into modules? This project supports it by new ES6 syntax (thanks to [babel](https://babeljs.io/)). ES6 modules are translated into AMD (RequireJS) modules. The main advantage of this setup is that you can use ES6 -> RequireJS for your own modules, and at the same time have normal access to node's `require()` to obtain stuff from npm.
```javascript
// Modules you write are required through new ES6 syntax
// (It will be translated into AMD definition).
import myOwnModule from './my_own_module';
// Node.js (npm) modules are required the same way as always
// (so you can still access all the goodness of npm).
var moment = require('moment');
```
#### Unit tests
Rocket.Chat.Electron has preconfigured [jasmine](http://jasmine.github.io/2.0/introduction.html) unit test runner. To run it go with standard:
```
npm test
```
You don't have to declare paths to spec files in any particular place. The runner will search through the project for all `*.spec.js` files and include them automatically.
# Making a release
**Note:** There are various icon and bitmap files in `resources` directory. Those are used in installers and are intended to be replaced by your own graphics.
To make ready for distribution installer use command:
```
npm run release
```
It will start the packaging process for operating system you are running this command on. Ready for distribution file will be outputted to `releases` directory.
You can create Windows installer only when running on Windows, the same is true for Linux and OSX. So to generate all three installers you need all three operating systems.
## Special precautions for Windows
As installer [NSIS](http://nsis.sourceforge.net/Main_Page) is used. You have to install it (version 3.0), and add NSIS folder to PATH in Environment Variables, so it is reachable to scripts in this project (path should look something like `C:/Program Files (x86)/NSIS`).
# License
The MIT License (MIT)
Copyright (c) 2015 Jakub Szwacz
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
| 38.412162 | 376 | 0.764644 | eng_Latn | 0.978794 |
0c90c46b214c78017ce78f6100db689275f3e854 | 412 | md | Markdown | sdk-api-src/content/bits10_1/index.md | amorilio/sdk-api | 54ef418912715bd7df39c2561fbc3d1dcef37d7e | [
"CC-BY-4.0",
"MIT"
] | null | null | null | sdk-api-src/content/bits10_1/index.md | amorilio/sdk-api | 54ef418912715bd7df39c2561fbc3d1dcef37d7e | [
"CC-BY-4.0",
"MIT"
] | null | null | null | sdk-api-src/content/bits10_1/index.md | amorilio/sdk-api | 54ef418912715bd7df39c2561fbc3d1dcef37d7e | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
UID: NA:bits10_1
title: Bits10_1.h header
ms.assetid: 6509c3e0-6a83-3d99-818c-d122c0eef2c4
ms.date: 01/11/2019
ms.keywords:
ms.topic: conceptual
tech.root: bits
f1_keywords:
- bits10_1
- bits10_1/bits10_1
---
# Bits10_1.h header
## -description
This header is used by Background Intelligent Transfer Service. For more information, see:
- [Background Intelligent Transfer Service](../_bits/index.md)
| 17.913043 | 90 | 0.754854 | eng_Latn | 0.530556 |
0c91d59ef3b341ad53d099c50dbf3822dcf3acbd | 62 | md | Markdown | README.md | priya55612/Object_Detection | 2a4f012dfc9d744cabb3e57368daaa61ca0a519b | [
"MIT"
] | null | null | null | README.md | priya55612/Object_Detection | 2a4f012dfc9d744cabb3e57368daaa61ca0a519b | [
"MIT"
] | null | null | null | README.md | priya55612/Object_Detection | 2a4f012dfc9d744cabb3e57368daaa61ca0a519b | [
"MIT"
] | null | null | null | # Object_Detection
Implementation of an Object Detection Demo
| 20.666667 | 42 | 0.854839 | eng_Latn | 0.8171 |
0c92405e877af1f6bed30f48a7cd55da19b8ecfb | 1,685 | md | Markdown | content/publication/2021-01-01_Memories_of_State_Te.md | nataliariquelme/publicaciones | e69bb7fab03783d2da04d8065ef41e8bfa68f865 | [
"MIT"
] | null | null | null | content/publication/2021-01-01_Memories_of_State_Te.md | nataliariquelme/publicaciones | e69bb7fab03783d2da04d8065ef41e8bfa68f865 | [
"MIT"
] | 43 | 2022-01-21T02:56:28.000Z | 2022-01-21T20:54:18.000Z | content/publication/2021-01-01_Memories_of_State_Te.md | nataliariquelme/publicaciones | e69bb7fab03783d2da04d8065ef41e8bfa68f865 | [
"MIT"
] | 1 | 2021-12-27T16:41:26.000Z | 2021-12-27T16:41:26.000Z | +++
title = "Memories of State Terrorism in Chile: Dark Ruins at Villa Grimaldi"
date = "2021-01-01"
authors = ["Carolina Aguilera"]
publication_types = ["2"]
publication = "Space and Culture 12063312211066563. SAGE Publications Inc https://doi.org/10.1177/12063312211066563"
publication_short = "Space and Culture 12063312211066563. SAGE Publications Inc https://doi.org/10.1177/12063312211066563"
abstract = "In this short essay, I explore the recent reassessment of ruined sites haunted by the echoes of State terrorism across the Southern Cone of Latin America, asking what is at stake in the conservation of former detention centers and focusing on Villa Grimaldi in Chile. The site was initially transformed into a green park but has subsequently become a museum in which remains of the original buildings and artifacts from the repressive past are publicly accessible. I draw on perspectives that claim that even ruins that portray past acts of inhumanity do not necessarily need to evoke melancholic or traumatic retrospection; rather, they are sites of alternative pasts and futures. The transition from the original green park design to a more prominent use of the ruins speaks of an invitation to reassess the past, addressing marginal aspects of emblematic memories, including the political conflict that underpinned the repression."
abstract_short = ""
url_source = "https://doi.org/10.1177/12063312211066563"
tags = ["Conflict"]
url_code = ""
image_preview = ""
selected = false
projects = []
url_pdf = ""
url_preprint = ""
url_dataset = ""
url_project = ""
url_slides = ""
url_video = ""
url_poster = ""
math = true
highlight = true
[header]
image = ""
caption = ""
+++
| 58.103448 | 946 | 0.781009 | eng_Latn | 0.989479 |
0c9249289e53bc739452d53da6d2a6c70c54d349 | 1,978 | md | Markdown | ricette/query/Associare_nome_regione_a_punti.md | gimmybruce/tansignari | f98e3096e74e77561754d75b1789e0ce25867cee | [
"CC-BY-4.0"
] | null | null | null | ricette/query/Associare_nome_regione_a_punti.md | gimmybruce/tansignari | f98e3096e74e77561754d75b1789e0ce25867cee | [
"CC-BY-4.0"
] | 1 | 2019-03-07T12:59:34.000Z | 2019-03-07T12:59:34.000Z | ricette/query/Associare_nome_regione_a_punti.md | gimmybruce/tansignari | f98e3096e74e77561754d75b1789e0ce25867cee | [
"CC-BY-4.0"
] | null | null | null |
# Associare il nome delle regioni ISTAT a dei punti demanio
- autore: _Totò Fiandaca_
<!-- TOC -->
- [Associare il nome delle regioni ISTAT a dei punti demanio](#associare-il-nome-delle-regioni-istat-a-dei-punti-demanio)
- [Dataset](#dataset)
- [Procedimento](#procedimento)
- [Query spaziale](#query-spaziale)
- [Mapshaper](#mapshaper)
- [Installare mapshaper](#installare-mapshaper)
- [Cosa fa questa ricetta](#cosa-fa-questa-ricetta)
<!-- /TOC -->
---
## Dataset
* [01_demanio.csv](https://gist.github.com/aborruso/503df6c6477c341431e23bc51bc37149/raw/7aac29415b99512758acffd05fa463081f011484/01_demanio.csv)
* [regioni.shp](https://www4.istat.it/it/archivio/209722)
## Procedimento
Dopo aver scaricato i due dataset occorre convertirli in shapefile oppure importarli in un database sqlite (con estensione spaziale), verificare che I DUE STRATI abbiano stesso EPSG (sistema di riferimento delle coordinate)
## Query spaziale
```SQL
CREATE TABLE opendemanio_3857_reg AS
SELECT a.*, r.DEN_REG
FROM opendemanio_3857 a, regioni_3857 r
WHERE ST_Intersects(a.geometry,r.geometry)
AND a.rowid IN (SELECT rowid
FROM SpatialIndex
WHERE f_table_name = 'opendemanio_3857' AND search_frame = r.geometry);
```
## Mapshaper
```
mapshaper opendemanio_3857.shp -join regioni_3857.shp fields=COD_REG,DEN_REG -o opendemanio_v01.csv
```
### Installare mapshaper
```
## per installarlo
sudo npm install -g mapshapersudo apt-get update
sudo apt-get install nodejs
sudo apt-get install npm
sudo npm install -g mapshaper
## per aggiornarlo:
sudo npm update -g mapshaper
```
## Cosa fa questa ricetta
Associa il nome delle regioni italiane ISTAT ai punti (demanio) che vi ricadono dentro; la query spaziale si puo' usare sia in [`spatialite_gui`](http://www.gaia-gis.it/gaia-sins/windows-bin-NEXTGEN-amd64/) che nella riga di comando, mentre la ricetta di [`mapshaper`](https://github.com/mbloch/mapshaper/wiki/Command-Reference) solo da riga di comando.
| 29.969697 | 353 | 0.757836 | ita_Latn | 0.746003 |
0c936eb02fa9a8a815ea8928bcfb2abc74cf0717 | 1,419 | md | Markdown | rules/do-you-know-the-6-stages-in-the-sales-pipeline/rule.md | chrishoogwerf/SSW.Rules.Content | 846d2c381dc9b1aabdf1a5bc41c5efd816699f09 | [
"CC0-1.0"
] | null | null | null | rules/do-you-know-the-6-stages-in-the-sales-pipeline/rule.md | chrishoogwerf/SSW.Rules.Content | 846d2c381dc9b1aabdf1a5bc41c5efd816699f09 | [
"CC0-1.0"
] | null | null | null | rules/do-you-know-the-6-stages-in-the-sales-pipeline/rule.md | chrishoogwerf/SSW.Rules.Content | 846d2c381dc9b1aabdf1a5bc41c5efd816699f09 | [
"CC0-1.0"
] | null | null | null | ---
type: rule
archivedreason:
title: Do you know the 6 stages in the Sales Pipeline?
guid: 81b66890-b893-44ab-8890-45154df4d4eb
uri: do-you-know-the-6-stages-in-the-sales-pipeline
created: 2012-08-30T13:06:58.0000000Z
authors:
- title: Adam Cogan
url: https://ssw.com.au/people/adam-cogan
related: []
redirects: []
---
Any opportunity that has not yet been converted to a sale will be at one of the following 6 stages:
<!--endintro-->
1. Initial Phone Call
* The client has made contact but no initial meeting has yet been made
2. Initial Meeting – Booked
* You've arranged an initial meeting and it's booked in
3. Follow Up Meeting – Booked
* In some cases, more than one initial meeting may be required before work or speccing commences
4. Spec Review Proposal – Waiting for Approval
* After the initial meeting, if the work requires it, a specification review is proposed
5. Spec Review – Booked
* The speccing phase has been approved and booked in
6. Project Proposal – Waiting for Approval
* After the spec review, the client has been given a proposal for a chunk of work. Once this is approved, the opportunity is closed as won
The old Sales Pipeline was 9 steps, whereas this new one is 6 steps.
::: bad

:::
::: good

:::
| 31.533333 | 142 | 0.732911 | eng_Latn | 0.997939 |
0c941f19b802487f6d8c195cc72ca33e06e7998c | 2,311 | md | Markdown | AlchemyInsights/intune-ios-ade-sync-errors.md | isabella232/OfficeDocs-AlchemyInsights-pr.fi-FI | d4c2084ab362e84a16286d8b4e6a1e137b271ce6 | [
"CC-BY-4.0",
"MIT"
] | 2 | 2020-05-19T19:06:30.000Z | 2020-09-17T11:26:00.000Z | AlchemyInsights/intune-ios-ade-sync-errors.md | MicrosoftDocs/OfficeDocs-AlchemyInsights-pr.fi-FI | d98cfb244ae52a6624ec7fb9c7fc1092811bdfb7 | [
"CC-BY-4.0",
"MIT"
] | 2 | 2022-02-09T06:50:18.000Z | 2022-02-09T06:50:31.000Z | AlchemyInsights/intune-ios-ade-sync-errors.md | isabella232/OfficeDocs-AlchemyInsights-pr.fi-FI | d4c2084ab362e84a16286d8b4e6a1e137b271ce6 | [
"CC-BY-4.0",
"MIT"
] | 2 | 2019-10-11T19:13:35.000Z | 2021-10-09T10:47:01.000Z | ---
title: Applen automaattisen laitteen rekisteröinnin synkronointivirheet
ms.author: v-jmathew
author: v-jmathew
manager: scotv
audience: Admin
ms.topic: article
ms.service: o365-administration
ROBOTS: NOINDEX, NOFOLLOW
localization_priority: Normal
ms.collection: Adm_O365
ms.custom:
- "9000654"
- "7256"
ms.openlocfilehash: 1664a26b313c4a38c9c6d78cdb89997749ba175fd3dd72f278e99bbd50b0ee84
ms.sourcegitcommit: b5f7da89a650d2915dc652449623c78be6247175
ms.translationtype: MT
ms.contentlocale: fi-FI
ms.lasthandoff: 08/05/2021
ms.locfileid: "54013745"
---
# <a name="apple-automatic-device-enrollment-sync-errors"></a>Applen automaattisen laitteen rekisteröinnin synkronointivirheet
"Olemme havainneet, että sinulla on yksi tai useampi ADE/DEP-tunnus, joiden virhetila on. Ennen kuin kunkin tunnuksen virhetila on ratkaistu, ADE-toiminto ei toimi odotetulla tavalla.".
Tämä virhe voi ilmetä monin eri tavoin, kuten:
1. Laitteet eivät ehkä synkronoidu ABM/ASM:stä Intuneen
2. Rekisteröintiprofiilimääritykset saattavat epäonnistua
3. Laitteet eivät ehkä suorita ADE-rekisteröintiä onnistuneesti
Tarkista Intune-konsolissa ilmoitettu synkronointivirhe kohdassa Laitteet > laitteen rekisteröinti **> Apple-> rekisteröintiohjelman tunnuksia.**
Yksi synkronointivirheen yleisimpiä syitä on nykyisen tunnuksen vanheneminen. Monissa tapauksissa ongelma ratkeaa, jos tunnus on uusittava.
Jos vähintään yksi tunnuksistasi on vanhentunut, tutustu seuraaviin ohjeisiin, joiden avulla voit uusia ne tarpeen mukaan:
[Automaattisen laitteen rekisteröintitunnuksen uusiminen](https://docs.microsoft.com/mem/intune/enrollment/device-enrollment-program-enroll-ios#renew-an-automated-device-enrollment-token)
Seuraavista ohjeista näet myös muut tunnuksen synkronointivirheitä aiheuttavat virheet:
[ABM-/ASM-synkronointivirheet iOS-/iPadOS- ja macOS-laitteiden automaattisen rekisteröinnin tunnuksille](https://docs.microsoft.com/mem/intune/enrollment/troubleshoot-ios-enrollment-errors#sync-token-errors-between-intune-and-ade-dep)
[ABM-/ASM-synkronointivirheet iOS-/iPadOS- ja macOS-laitteiden automaattisen rekisteröinnin tunnuksille](https://docs.microsoft.com/mem/intune/enrollment/troubleshoot-ios-enrollment-errors#resolutions-when-syncing-tokens-between-intune-and-abmasm-for-automated-device-enrollment)
| 45.313725 | 279 | 0.836867 | fin_Latn | 0.996506 |
0c9449f42a2a9a9a5d9a706ff41ec92967d3e8f3 | 592 | md | Markdown | specs.md | Nancy-codergirl/Blogsite | 42333293718ab961c32cef2ac5fd67ed9d854044 | [
"MIT"
] | null | null | null | specs.md | Nancy-codergirl/Blogsite | 42333293718ab961c32cef2ac5fd67ed9d854044 | [
"MIT"
] | null | null | null | specs.md | Nancy-codergirl/Blogsite | 42333293718ab961c32cef2ac5fd67ed9d854044 | [
"MIT"
] | null | null | null | # Blogsite
This is an inspirational blog.
#### PROJECT SPECIFICATIONS
* Good internet connection.
* Ensure both virtual and python are installed on your machine.
* Work on virtual
<br/>
User Requirements:
1. Users should be able to view blogs.
2. Users should be able to subscribe to blog updates.
3. Users should be able to comment and view their comments.
4. Writers should be able to login to the site.
5. Writers should be able to write a blog.
6. Writers should be able to view blogs.
7. Writers should be able to view comments on blogs.
8. Writers should be able to update their profile. | 32.888889 | 63 | 0.766892 | eng_Latn | 0.994725 |
0c947cca6173a2e63f111f64a723e04e50699637 | 182 | md | Markdown | README.md | IDHadare/Customer-tweet-analysis | 6790477a623c6bc9cb384a607126aca6f1105798 | [
"Apache-2.0"
] | null | null | null | README.md | IDHadare/Customer-tweet-analysis | 6790477a623c6bc9cb384a607126aca6f1105798 | [
"Apache-2.0"
] | null | null | null | README.md | IDHadare/Customer-tweet-analysis | 6790477a623c6bc9cb384a607126aca6f1105798 | [
"Apache-2.0"
] | null | null | null | # Customer-tweet-analysis
The goal of this work was to analyze customer tweets from different companies in order to group and classify them by degree of relevance and other criteria.
| 60.666667 | 155 | 0.813187 | eng_Latn | 0.999989 |
0c94b8b23320cb1c412dc4fbaaad1bcd99d070f9 | 107 | md | Markdown | _products/_defaults.md | mattstyles333/glasses | 71843189781f73e6337392b41ecf94edd73b0e69 | [
"MIT"
] | 1 | 2020-03-16T12:12:22.000Z | 2020-03-16T12:12:22.000Z | _products/_defaults.md | mattstyles333/glasses | 71843189781f73e6337392b41ecf94edd73b0e69 | [
"MIT"
] | 4 | 2019-08-29T18:33:51.000Z | 2022-02-26T05:33:49.000Z | _products/_defaults.md | mattstyles333/glasses | 71843189781f73e6337392b41ecf94edd73b0e69 | [
"MIT"
] | null | null | null | ---
name:
type: glasses
brand:
price:
sku:
description:
sizes:
styles:
- name:
color:
image:
---
| 7.642857 | 13 | 0.598131 | eng_Latn | 0.80065 |
0c94ee95fbf0dc0daf9bb1551ebe99a6c599661e | 579 | md | Markdown | api/Excel.Top10.Parent.md | skucab/VBA-Docs | 2912fe0343ddeef19007524ac662d3fcb8c0df09 | [
"CC-BY-4.0",
"MIT"
] | 1 | 2021-04-08T20:10:22.000Z | 2021-04-08T20:10:22.000Z | api/Excel.Top10.Parent.md | skucab/VBA-Docs | 2912fe0343ddeef19007524ac662d3fcb8c0df09 | [
"CC-BY-4.0",
"MIT"
] | 1 | 2019-04-02T13:17:46.000Z | 2019-04-02T13:17:46.000Z | api/Excel.Top10.Parent.md | skucab/VBA-Docs | 2912fe0343ddeef19007524ac662d3fcb8c0df09 | [
"CC-BY-4.0",
"MIT"
] | 1 | 2021-09-28T07:45:29.000Z | 2021-09-28T07:45:29.000Z | ---
title: Top10.Parent property (Excel)
keywords: vbaxl10.chm821075
f1_keywords:
- vbaxl10.chm821075
ms.prod: excel
api_name:
- Excel.Top10.Parent
ms.assetid: 8c5acd64-8f29-fc28-ed5d-4947e0f1be53
ms.date: 06/08/2017
localization_priority: Normal
---
# Top10.Parent property (Excel)
Returns the parent object for the specified object. Read-only.
## Syntax
_expression_.**Parent**
_expression_ A variable that represents a [Top10](./Excel.Top10.md) object.
## See also
[Top10 Object](Excel.Top10.md)
[!include[Support and feedback](~/includes/feedback-boilerplate.md)] | 18.09375 | 75 | 0.758204 | eng_Latn | 0.602621 |
0c950af818aa2a741e86498fe78405c378291b42 | 1,249 | md | Markdown | docs/connect/jdbc/reference/sqlserverresource-class.md | Philippe-Geiger/sql-docs.fr-fr | 7fe32a3b70e9219529d5b00725233abf9d5982f6 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/connect/jdbc/reference/sqlserverresource-class.md | Philippe-Geiger/sql-docs.fr-fr | 7fe32a3b70e9219529d5b00725233abf9d5982f6 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/connect/jdbc/reference/sqlserverresource-class.md | Philippe-Geiger/sql-docs.fr-fr | 7fe32a3b70e9219529d5b00725233abf9d5982f6 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
description: Classe SQLServerResource
title: Classe SQLServerResource | Microsoft Docs
ms.custom: ''
ms.date: 01/19/2017
ms.prod: sql
ms.prod_service: connectivity
ms.reviewer: ''
ms.technology: connectivity
ms.topic: conceptual
ms.assetid: e7e362d1-6b5f-4e8c-8862-2001102cf4f9
author: David-Engel
ms.author: v-daenge
ms.openlocfilehash: 76aaeeb21a6b4a72a31d2437628d8bf397fee9ee
ms.sourcegitcommit: e700497f962e4c2274df16d9e651059b42ff1a10
ms.translationtype: HT
ms.contentlocale: fr-FR
ms.lasthandoff: 08/17/2020
ms.locfileid: "88458251"
---
# <a name="sqlserverresource-class"></a>SQLServerResource class

[!INCLUDE[Driver_JDBC_Download](../../../includes/driver_jdbc_download.md)]

Represents a localized error string resource. This class is reserved for internal use only.

**Package:** com.microsoft.sqlserver.jdbc

**Extends:** java.util.ListResourceBundle

## <a name="syntax"></a>Syntax

```
public class SQLServerResource
```

## <a name="see-also"></a>See also

[SQLServerResource members](../../../connect/jdbc/reference/sqlserverresource-members.md)

[JDBC driver API reference](../../../connect/jdbc/reference/jdbc-driver-api-reference.md)
| 29.738095 | 117 | 0.748599 | fra_Latn | 0.201666 |
0c95c0584877d36d419fa73a1f2681a15f0bb939 | 591 | md | Markdown | Reviews/246 - Development of Services DAO Platform/3 - Milestone 3/assets/build/build-DAO_NotificationService.md | yunusem/dxd_codereview | 973c13ec120ec1336fd7c28be0ff5507652e0448 | [
"Apache-2.0"
] | null | null | null | Reviews/246 - Development of Services DAO Platform/3 - Milestone 3/assets/build/build-DAO_NotificationService.md | yunusem/dxd_codereview | 973c13ec120ec1336fd7c28be0ff5507652e0448 | [
"Apache-2.0"
] | null | null | null | Reviews/246 - Development of Services DAO Platform/3 - Milestone 3/assets/build/build-DAO_NotificationService.md | yunusem/dxd_codereview | 973c13ec120ec1336fd7c28be0ff5507652e0448 | [
"Apache-2.0"
] | null | null | null | Microsoft (R) Build Engine version 16.7.2+b60ddb6f4 for .NET
Copyright (C) Microsoft Corporation. All rights reserved.
Determining projects to restore...
Restored /workspace/ServicesDAO/DAO_NotificationService/DAO_NotificationService.csproj (in 811 ms).
1 of 2 projects are up-to-date for restore.
Helpers -> /workspace/ServicesDAO/Helpers/bin/Debug/netcoreapp3.1/Helpers.dll
DAO_NotificationService -> /workspace/ServicesDAO/DAO_NotificationService/bin/Debug/netcoreapp3.1/DAO_NotificationService.dll
Build succeeded.
0 Warning(s)
0 Error(s)
Time Elapsed 00:00:02.61
| 39.4 | 127 | 0.790186 | yue_Hant | 0.71755 |
0c95c6eb1ecd59d35ef5bdabeec3cc90d4f44143 | 15 | md | Markdown | packages/pdfeasy-docs/document/images.md | Novout/pdfeasy | 2c9eb25a428b9b669da1616900d7317e538f64e5 | [
"MIT"
] | null | null | null | packages/pdfeasy-docs/document/images.md | Novout/pdfeasy | 2c9eb25a428b9b669da1616900d7317e538f64e5 | [
"MIT"
] | 6 | 2022-02-09T00:00:53.000Z | 2022-02-22T10:23:57.000Z | packages/pdfeasy-docs/document/images.md | Novout/pdfeasy | 2c9eb25a428b9b669da1616900d7317e538f64e5 | [
"MIT"
] | null | null | null | # Images
TODO
| 3.75 | 8 | 0.666667 | kor_Hang | 0.437662 |
0c96305a429ecde97275e384ca12bdacba59ef40 | 913 | md | Markdown | dxtbx/README.md | rimmartin/cctbx_project | 644090f9432d9afc22cfb542fc3ab78ca8e15e5d | [
"BSD-3-Clause-LBNL"
] | 2 | 2021-03-18T12:31:57.000Z | 2022-03-14T06:27:06.000Z | dxtbx/README.md | rimmartin/cctbx_project | 644090f9432d9afc22cfb542fc3ab78ca8e15e5d | [
"BSD-3-Clause-LBNL"
] | null | null | null | dxtbx/README.md | rimmartin/cctbx_project | 644090f9432d9afc22cfb542fc3ab78ca8e15e5d | [
"BSD-3-Clause-LBNL"
] | 1 | 2020-02-04T15:39:06.000Z | 2020-02-04T15:39:06.000Z | ## Diffraction Experiment Toolbox
This code has now moved into a separate repository.
If you set up your build environment after April 2019 it should already include
the necessary changes.
### Updating developer installations
You can pick up dxtbx from its new location by running the command
```bash
libtbx.install dxtbx
```
which will install dxtbx in the modules/ directory. You should also run
```bash
cd $(libtbx.find_in_repositories cctbx_project)
git clean -diX dxtbx
```
to remove any `.pyc` and other python runtime debris from the previous dxtbx
path, as these files could interfere with python package discovery.
The clean command will ask for confirmation (enter `1`) before deleting files.
### Code migration
If you have any cctbx\_project development branches that touch the dxtbx
directory and need help in transferring commits over, please open an issue on
Github and we are happy to help.
| 31.482759 | 79 | 0.7908 | eng_Latn | 0.999402 |
0c974cba02f30de15aee76c3d3e5bc011cb599aa | 2,703 | md | Markdown | about.md | trellis-ldp/trellis-website | 58c139b00c3080574a5a1731ded81dce6985db70 | [
"Apache-2.0"
] | 2 | 2018-05-26T10:18:35.000Z | 2020-12-09T16:40:26.000Z | about.md | trellis-ldp/trellis-website | 58c139b00c3080574a5a1731ded81dce6985db70 | [
"Apache-2.0"
] | 13 | 2020-08-02T13:53:41.000Z | 2021-07-12T22:56:37.000Z | about.md | trellis-ldp/trellis-website | 58c139b00c3080574a5a1731ded81dce6985db70 | [
"Apache-2.0"
] | null | null | null | ---
layout: page
title: About
permalink: /about.html
---
Trellis is a [linked data](https://www.w3.org/TR/ldp/) server. As a consequence, it is especially
well suited for storing Web resources that _connect_ to other data on the Web. The
[Linked Data Platform Primer](https://www.w3.org/TR/ldp-primer/) is a good introduction to
these concepts.
Trellis is also part of a larger effort that aims to (re-)decentralize the Web, allowing
the creators of content to have true ownership over their data. Furthermore, Trellis is built
entirely on existing [Web standards](https://github.com/trellis-ldp/trellis/wiki/Web-Standards)
standards, so its interface is stable, predictable and well-understood.
## Community
**We welcome your contributions and ideas.** TrellisLDP is an open source project, and all of the code
is released under the Apache License, Version 2.0.
Contributions are always welcome.
What you can do:
* File a [bug or issue](https://github.com/trellis-ldp/trellis/issues)
* Join the [mailing list](https://groups.google.com/group/trellis-ldp)
* Try out the [latest version](download.html) of Trellis.
## Documentation
Information about specific features can be found on the
[project Wiki](https://github.com/trellis-ldp/trellis/wiki).
Developers interested in contributing or extending Trellis may find the
[JavaDocs](https://www.trellisldp.org/docs/trellis/current/apidocs/) useful.
## Semantic Web
The Linked Data Platform has its roots in the [Semantic Web](https://en.wikipedia.org/wiki/Semantic_Web),
and Trellis is built on many of these underlying ideas.
As such, [RDF](https://en.wikipedia.org/wiki/Resource_Description_Framework)
plays a central role in how resources are represented. The Trellis project defines
its own [RDF vocabulary](https://www.trellisldp.org/ns/trellis), which is used in certain
HTTP interactions with the server software. In addition, a [DOAP file](https://www.trellisldp.org/doap.ttl)
describes the project itself.
The Trellis docker containers are regularly checked against an [LDP Testsuite](https://github.com/trellis-ldp/ldp-testsuite).
The results are posted for [basic](https://www.trellisldp.org/ldp/report/basic.html) ([ttl](https://www.trellisldp.org/ldp/report/basic.ttl) | [jsonld](https://www.trellisldp.org/ldp/report/basic.jsonld)),
[direct](https://www.trellisldp.org/ldp/report/direct.html) ([ttl](https://www.trellisldp.org/ldp/report/direct.ttl) | [jsonld](https://www.trellisldp.org/ldp/report/direct.jsonld)) and
[indirect](https://www.trellisldp.org/ldp/report/indirect.html) ([ttl](https://www.trellisldp.org/ldp/report/indirect.ttl) | [jsonld](https://www.trellisldp.org/ldp/report/indirect.jsonld)) containers.
| 51 | 205 | 0.771365 | eng_Latn | 0.879337 |
0c9753c55c02610b70086c89708ac2932ca7dd23 | 6,446 | md | Markdown | guides/database-interaction-through-models/dirty-records.md | chapmandu/cfwheels | 4feb2eaa2256e0fad8d88826005b443d3eaabb42 | [
"Apache-2.0"
] | null | null | null | guides/database-interaction-through-models/dirty-records.md | chapmandu/cfwheels | 4feb2eaa2256e0fad8d88826005b443d3eaabb42 | [
"Apache-2.0"
] | null | null | null | guides/database-interaction-through-models/dirty-records.md | chapmandu/cfwheels | 4feb2eaa2256e0fad8d88826005b443d3eaabb42 | [
"Apache-2.0"
] | null | null | null | ---
description: How to track changes to objects in your application.
---
# Dirty Records
Wheels provides some very useful methods for tracking changes to objects. You might think, _Why do I need that? Won't I just know that I changed the object myself?_
Well, that depends on the structure of your code.
As you work with Wheels and move away from that procedural spaghetti mess you used to call code to a better, cleaner object-oriented approach, you may get a sense that you have lost control of what your code is doing. Your new code is creating objects, they in turn call methods on other objects automatically, methods are being called from multiple places, and so on. Don't worry though, this is a good thing. It just takes a while to get used to, and with the help of some Wheels functionality, it won't take you that long to get used to it either.
### An Example with Callbacks
One area where this sense of losing control is especially noticeable is when you are using _callbacks_ on objects (see the chapter on [Object Callbacks](https://guides.cfwheels.org/cfwheels-guides/database-interaction-through-models/object-callbacks) for more info). So let's use that for our example.
Let's say you have used a callback to specify that a method should be called whenever a `user` object is saved to the database. You won't know exactly **where** this method was called from. It could have been the user doing it themselves on the website, or it could have been done from your internal administration area. Generally speaking, you don't need to know this either.
One thing your business logic might need to know though is a way to tell exactly **what** was changed on the object. Maybe you want to handle things differently if the user's last name was changed than if the email address was changed, for example.
Let's look at the methods Wheels provide to make tracking these changes easier for you.
### Methods for Tracking Changes
Let's get to coding…
```javascript
post = model("post").findByKey(1);
result = post.hasChanged();
```
Here we are using the [hasChanged()](https://api.cfwheels.org/model.haschanged.html) method to see if any of the object properties has changed.
By the way, when we are talking about "change" in Wheels, we always mean whether or not an object's properties have changed compared to what is stored in the columns they map to in the database table.
In the case of the above example, the `result` variable will contain `false` because we just fetched the object from the database and did not make any changes to it at all.
Well, let's make a change then. If we didn't, this chapter wouldn't be all that interesting, would it?
```javascript
post.title = "A New Post Title";
result = post.hasChanged();
```
Now result will be `true` because what is stored in `post.title` differs from what is stored in the `title`column for this record in the `posts` table (well, unless the title was "A New Post Title" even before the change, in which case the result would still be `false`).
When calling [hasChanged()](https://api.cfwheels.org/model.haschanged.html) with no arguments, Wheels will check **all** properties on the object and return `true` if any of them have changed. If you want to see if a specific property has changed, you can pass in `property="title"` to it or use the dynamic method [XXXHasChanged()](https://api.cfwheels.org/model.haschanged.html). Replace `XXX` with the name of the property. In our case, the method would then be named `titleHasChanged()`.
If you want to see what a value was before a change was made, you can do so by calling [changedFrom()](https://api.cfwheels.org/model.changedfrom.html) and passing in the name of a property. This can also be done with the dynamic [XXXChangedFrom()](https://api.cfwheels.org/model.changedfrom.html) method.
When an object is in a changed state, there are a couple of methods you can use to report back on these changes. [changedProperties()](https://api.cfwheels.org/model.changedproperties.html) will give you a list of the property names that have been changed. [allChanges()](https://api.cfwheels.org/model.allchanges.html) returns a struct containing all the changes (both the property names and the changed values themselves).
If you have made changes to an object and for some reason you want to revert it back, you can do so by calling [reload()](https://api.cfwheels.org/model.reload.html) on it. This will query the database and update the object properties with their corresponding values from the database.
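The pattern behind these methods is framework-agnostic: change tracking amounts to comparing an object's current properties against the values that were loaded from the database. As a rough illustration only (plain Python, not CFWheels API — every name below is invented for the sketch):

```python
class DirtyRecord:
    """Minimal sketch of property change tracking (illustrative, not CFWheels)."""

    def __init__(self, **persisted):
        self._persisted = dict(persisted)  # values as stored in the database
        self._current = dict(persisted)    # values currently on the object

    def set(self, name, value):
        self._current[name] = value

    def get(self, name):
        return self._current[name]

    def changed_properties(self):
        # names whose current value differs from the persisted value
        return sorted(k for k in self._current
                      if self._current[k] != self._persisted.get(k))

    def all_changes(self):
        return {k: {"changedFrom": self._persisted.get(k),
                    "changedTo": self._current[k]}
                for k in self.changed_properties()}

    def has_changed(self, name=None):
        if name is not None:
            return self._current.get(name) != self._persisted.get(name)
        return bool(self.changed_properties())

    def changed_from(self, name):
        return self._persisted.get(name)

    def save(self):
        self._persisted = dict(self._current)  # saving clears the dirty state

    def reload(self):
        self._current = dict(self._persisted)  # revert to database values


post = DirtyRecord(title="Old Title")
post.set("title", "A New Post Title")
print(post.has_changed("title"))   # True: differs from the stored value
print(post.changed_from("title"))  # the value before the change
post.save()
print(post.has_changed())          # False: saving clears the changed state
```

Only the bookkeeping idea carries over; in CFWheels the comparison baseline is always the mapped database columns, as described above.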
OK, let's save the object to the database now and see how that affects things.
```javascript
post.save();
result = post.hasChanged();
```
Now `result` will once again contain `false`. When you save a changed (a.k.a. "dirty") object, it clears out its changed state tracking and is considered unchanged again.
### Don't Forget the Context
All of the examples in this chapter look a little ridiculous because it doesn't make much sense to check the status of an object when you changed it manually in your code. As we said in the beginning of the chapter, when put into context of callbacks, multiple methods, etc., it will become clear how useful these methods really are.
### Internal Use of Change Tracking
It's worth noting here that Wheels makes good use of this change tracking internally as well. If you make changes to an object, Wheels is smart enough to only update the changed columns, leaving the rest alone. This is good for a number of reasons but perhaps most importantly for database performance. In high traffic web applications, the bottleneck is often the database, and anything that can be done to prevent unnecessary database access is a good thing.
### One "Gotcha" About Tracking Changes
If you create a brand new object with the [new()](https://api.cfwheels.org/model.new.html) method and call [hasChanged()](https://api.cfwheels.org/model.haschanged.html) on it, it will return `true`. The reason for this seemingly unexpected behavior is that change is always viewed from the database's perspective. The [hasChanged()](https://api.cfwheels.org/model.haschanged.html) method will return `true` in this case because it is different from what is stored in the database (i.e. it doesn't exist at all in the database yet).
If you would simply like to know if an object exists in the database or not, you can use the [isNew()](https://api.cfwheels.org/model.isnew.html) method.
| 83.714286 | 550 | 0.772107 | eng_Latn | 0.999545 |
0c98cb5b3a6f2116ee8a15a02fecc0ed7b788214 | 3,113 | md | Markdown | doc/source/api/VoxelMesherDMC.md | jacobcoughenour/godot_voxel | 42a8d0e35facf0af9012855f94b5f57a391fe8bd | [
"MIT"
] | null | null | null | doc/source/api/VoxelMesherDMC.md | jacobcoughenour/godot_voxel | 42a8d0e35facf0af9012855f94b5f57a391fe8bd | [
"MIT"
] | null | null | null | doc/source/api/VoxelMesherDMC.md | jacobcoughenour/godot_voxel | 42a8d0e35facf0af9012855f94b5f57a391fe8bd | [
"MIT"
] | null | null | null | # VoxelMesherDMC
Inherits: [VoxelMesher](VoxelMesher.md)
Implements isosurface generation (smooth voxels) using [Dual Marching Cubes](https://www.volume-gfx.com/volume-rendering/dual-marching-cubes/).
## Properties:
Type | Name | Default
------ | -------------------------------------- | --------
`int` | [geometric_error](#i_geometric_error) | 0
`int` | [mesh_mode](#i_mesh_mode) | 0
`int` | [seam_mode](#i_seam_mode) | 0
`int` | [simplify_mode](#i_simplify_mode) | 0
<p></p>
## Methods:
Return | Signature
----------------------------------------------------------------------------------- | ----------------------------------------------------------------------------------------------------------------------------------
[float](https://docs.godotengine.org/en/stable/classes/class_float.html) | [get_geometric_error](#i_get_geometric_error) ( ) const
[Dictionary](https://docs.godotengine.org/en/stable/classes/class_dictionary.html) | [get_statistics](#i_get_statistics) ( ) const
[void](#) | [set_geometric_error](#i_set_geometric_error) ( [float](https://docs.godotengine.org/en/stable/classes/class_float.html) error )
<p></p>
## Enumerations:
enum **MeshMode**:
- **MESH_NORMAL** = **0**
- **MESH_WIREFRAME** = **1**
- **MESH_DEBUG_OCTREE** = **2**
- **MESH_DEBUG_DUAL_GRID** = **3**
enum **SimplifyMode**:
- **SIMPLIFY_OCTREE_BOTTOM_UP** = **0**
- **SIMPLIFY_OCTREE_TOP_DOWN** = **1**
- **SIMPLIFY_NONE** = **2**
enum **SeamMode**:
- **SEAM_NONE** = **0**
- **SEAM_MARCHING_SQUARE_SKIRTS** = **1**
## Property Descriptions
- [int](https://docs.godotengine.org/en/stable/classes/class_int.html)<span id="i_geometric_error"></span> **geometric_error** = 0
- [int](https://docs.godotengine.org/en/stable/classes/class_int.html)<span id="i_mesh_mode"></span> **mesh_mode** = 0
- [int](https://docs.godotengine.org/en/stable/classes/class_int.html)<span id="i_seam_mode"></span> **seam_mode** = 0
- [int](https://docs.godotengine.org/en/stable/classes/class_int.html)<span id="i_simplify_mode"></span> **simplify_mode** = 0
## Method Descriptions
- [float](https://docs.godotengine.org/en/stable/classes/class_float.html)<span id="i_get_geometric_error"></span> **get_geometric_error**( )
- [Dictionary](https://docs.godotengine.org/en/stable/classes/class_dictionary.html)<span id="i_get_statistics"></span> **get_statistics**( )
- [void](#)<span id="i_set_geometric_error"></span> **set_geometric_error**( [float](https://docs.godotengine.org/en/stable/classes/class_float.html) error )
_Generated on Apr 10, 2021_
| 40.960526 | 216 | 0.519113 | yue_Hant | 0.498038 |
0c9927ffaafd7f925052bff72de3807799da9a30 | 7,019 | md | Markdown | experiments/1_fcrn_a/README.md | Tudor67/Object-Counting | b408ff5c7195ef64a7449460a00d6085ebaa7555 | [
"MIT"
] | 7 | 2019-08-13T08:15:38.000Z | 2021-11-15T10:26:32.000Z | experiments/1_fcrn_a/README.md | Tudor67/Object-Counting | b408ff5c7195ef64a7449460a00d6085ebaa7555 | [
"MIT"
] | 1 | 2019-03-26T15:51:57.000Z | 2019-03-26T22:45:54.000Z | experiments/1_fcrn_a/README.md | Tudor67/Object-Counting | b408ff5c7195ef64a7449460a00d6085ebaa7555 | [
"MIT"
] | 2 | 2020-01-27T12:54:19.000Z | 2020-07-18T06:05:32.000Z | # FCRN-A
# Results
* Mean Absolute Error (MAE), Root Mean Squared Error (RMSE), Underestimate (%U), Overestimate (%O) and Difference (%D) on counting datasets.
* Results are presented just for the test set.
## VGG Cells Dataset
| Method | Loss | Epochs | N | MAE | RMSE | %U | %O | %D |
| :--- | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: |
| FCRN-A, full images | MSE | ~100 | 32 | 6.0 ± 1.7 | 7.2 ± 1.8 | 0.6% ± 0.6% | 2.9% ± 1.4% | 3.5% ± 1.0% |
| FCRN-A, full images | MSE | ~100 | 64 | 5.4 ± 1.7 | 6.5 ± 1.9 | 2.5% ± 1.5% | 0.6% ± 0.7% | 3.1% ± 1.0% |
| FCRN-A, full images | MAE | ~100 | 32 | 8.3 ± 2.7 | 10.0 ± 3.1 | 4.6% ± 1.8% | 0.2% ± 0.3% | 4.8% ± 1.6% |
| FCRN-A, full images | MAE | ~100 | 64 | 6.6 ± 1.8 | 8.5 ± 2.1 | 3.7% ± 1.1% | 0.1% ± 0.1% | 3.9% ± 1.0% |
| FCRN-A, full images | LogCosh | ~100 | 32 | 8.3 ± 1.8 | 9.8 ± 1.9 | 2.4% ± 2.7% | 2.4% ± 2.1% | 4.8% ± 1.0% |
|`FCRN-A, full images` |`LogCosh`|`~100` |`64` |`3.6 ± 0.3`| `4.5 ± 0.4`|`0.9% ± 0.5%`|`1.2% ± 0.4%`|`2.1% ± 0.2%`|
| FCRN-A, patches 4 * (128x128) | MSE | ~100 | 32 | 5.5 ± 0.5 | 6.6 ± 0.6 | 0.9% ± 1.0% | 2.2% ± 1.1% | 3.2% ± 0.3% |
| FCRN-A, patches 4 * (128x128) | MSE | ~100 | 64 | 3.9 ± 1.1 | 4.8 ± 1.0 | 0.6% ± 0.4% | 1.7% ± 1.0% | 2.3% ± 0.7% |
| FCRN-A, patches 4 * (128x128) | MAE | ~100 | 32 | 6.2 ± 1.8 | 8.0 ± 1.8 | 3.4% ± 1.1% | 0.2% ± 0.1% | 3.6% ± 1.0% |
| FCRN-A, patches 4 * (128x128) | MAE | ~100 | 64 | 7.4 ± 1.0 | 9.2 ± 1.1 | 4.3% ± 0.6% | 0.0% ± 0.0% | 4.3% ± 0.6% |
| FCRN-A, patches 4 * (128x128) | LogCosh | ~100 | 32 | 5.0 ± 1.8 | 6.5 ± 2.5 | 1.7% ± 1.2% | 1.2% ± 0.7% | 2.9% ± 1.0% |
| FCRN-A, patches 4 * (128x128) | LogCosh | ~100 | 64 | 4.0 ± 0.9 | 5.3 ± 1.0 | 1.8% ± 0.9% | 0.5% ± 0.6% | 2.4% ± 0.5% |
* N - number of train images;
* Our implementation does not include data preprocessing and augmentation;
* Standard deviation corresponds to 5 different draws of training and validation sets;
* Counts per image: 174 ± 64.
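The table metrics can be reproduced from per-image counts. A minimal sketch follows; defining %U and %O as under- and over-counted objects relative to the total true count (with %D their sum) is an assumption of this sketch:

```python
import math

def counting_metrics(true_counts, pred_counts):
    """MAE, RMSE, %U, %O, %D for per-image object counts (sketch)."""
    errs = [p - t for t, p in zip(true_counts, pred_counts)]
    mae = sum(abs(e) for e in errs) / len(errs)
    rmse = math.sqrt(sum(e * e for e in errs) / len(errs))
    total = sum(true_counts)
    # under/overestimate as a percentage of the total true count
    pct_u = 100.0 * sum(max(t - p, 0) for t, p in zip(true_counts, pred_counts)) / total
    pct_o = 100.0 * sum(max(p - t, 0) for t, p in zip(true_counts, pred_counts)) / total
    return mae, rmse, pct_u, pct_o, pct_u + pct_o

print(counting_metrics([10, 10], [9, 12]))  # MAE=1.5, %U=5.0, %O=10.0, %D=15.0
```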
## CARPK Dataset
| Method | Loss | Epochs | MAE | RMSE | %U | %O | %D |
| :--- | :---: | :---: | :---: | :---: | :---: | :---: | :---: |
| FCRN-A, full images | MSE | 15 | 21.15 | 26.34 | 13.07% | 7.38% | 20.45% |
| FCRN-A, full images | MAE | 1/15 | 95.54 | 103.78 | 92.30% | 0.02% | 92.32% |
| FCRN-A, full images | LogCosh | 14/15 | 23.34 | 29.65 | 20.41% | 2.15% | 22.56% |
| FCRN-A, patches 32 * (128x128)| MSE | 15 | 22.10 | 28.73 | 18.13% | 3.22% | 21.35% |
| FCRN-A, patches 32 * (128x128)| MAE | 3/15 |103.48 | 110.63 | 99.99% | 0.00% | 99.99% |
| FCRN-A, patches 32 * (128x128)| LogCosh | 15 | 26.19 | 31.63 | 23.81% | 1.50% | 25.31% |
* FCRN-A trained with full images or with 128x128 patches (MSE loss, 15 epochs) overfits the train set:
| Details | Split | Loss | Epochs | MAE | RMSE |
| :--- | :---: | :---: | :---: | :---: | :---: |
| full images | train | MSE | 15 | 5.32 | 5.90 |
| full images | test | MSE | 15 | 21.15 | 26.34 |
| patches 32 * (128x128)| train | MSE | 15 | 3.65 | 5.14 |
| patches 32 * (128x128)| test | MSE | 15 | 22.10 | 28.73 |
* FCRN-A trained with 32 * (128x128) patches (MSE loss):
| Epochs | Split | MAE | RMSE |
| :---: | :---: | :---: | :---: |
| 1 | train | 20.01 | 23.50 |
| 1 | test | 27.03 | 32.32 |
| 2 | train | 15.63 | 18.91 |
| 2 | test | 18.65 | 25.01 |
| 3 | train | 7.16 | 9.37 |
| 3 | test | 14.73 | 17.45 |
| 5 | train | 5.88 | 7.89 |
| 5 | test | 16.02 | 19.84 |
| 10 | train | 3.92 | 5.25 |
| 10 | test | 24.82 | 31.04 |
| 15 | train | 3.65 | 5.14 |
| 15 | test | 22.10 | 28.73 |
* FCRN-A trained with 32 * (128x128) patches (LogCosh loss):
| Epochs | Split | MAE | RMSE | %U | %O | %D |
| :---: | :---: | :---: | :---: | :---: | :---: | :---: |
| 1 | train | 36.18 | 38.86 | 85.06% | 0.00% | 85.06% |
| 1 | test | 24.44 | 33.19 | 22.30% | 1.32% | 23.62% |
| 2 | train | 8.69 | 11.89 | 17.42% | 3.01% | 20.43% |
| 2 | test | 17.33 | 21.74 | 7.19% | 9.56% | 16.75% |
| 3 | train | 8.56 | 10.85 | 3.15% | 16.96% | 20.11% |
| 3 | test | 17.14 | 21.35 | 9.12% | 7.44% | 16.56% |
| 4 | train | 6.38 | 7.55 | 2.87% | 12.12% | 14.99% |
| 4 | test | 14.44 | 18.18 | 4.49% | 9.46% | 13.95% |
| 5 | train | 5.38 | 7.20 | 4.35% | 8.30% | 12.65% |
|`5` |`test` |`12.13`|`15.72`| `5.62%`| `6.10%`|`11.72%`|
| 6 | train | 4.06 | 6.05 | 6.82% | 2.72% | 9.54% |
| 6 | test | 26.95 | 32.37 | 25.12% | 0.92% | 26.04% |
| 10 | train | 3.75 | 4.63 | 3.40% | 5.41% | 8.81% |
| 10 | test | 18.10 | 23.58 | 10.84% | 6.64% | 17.48% |
| 15 | train | 4.11 | 4.75 | 1.74% | 7.92% | 9.66% |
| 15 | test | 26.19 | 31.63 | 23.81% | 1.50% | 25.31% |
* Unstable training.
## ShanghaiTech (Part B) Dataset
| Method | Loss | Epochs | MAE | RMSE | %U | %O | %D |
| :--- | :---: | :---: | :---: | :---: | :---: | :---: | :---: |
| FCRN-A, full images | MSE | 5/20 | 52.95 | 74.53 | 19.08% | 23.73% | 42.81% |
| FCRN-A, full images | MSE | 3/100 | 53.27 | 67.10 | 11.61% | 31.45% | 43.06% |
| FCRN-A, full images | MAE | 51/100 | 123.70 | 155.97 | 100.00% | 0.00% | 100.00% |
|`FCRN-A, full images` |`LogCosh`|`24/30` | `19.95`| `33.94`| `11.70%`| `4.43%`| `16.13%`|
| FCRN-A, full images | LogCosh | 23/100 | 23.69 | 40.34 | 18.07% | 1.08% | 19.15% |
| FCRN-A, patches 32 * (128x128) | MSE | 54/100 | 21.49 | 34.98 | 9.33% | 8.04% | 17.37% |
| FCRN-A, patches 32 * (128x128) | MAE | 32/100 | 123.70 | 155.97 | 100.00% | 0.00% | 100.00% |
| FCRN-A, patches 32 * (128x128) | LogCosh | 86/100 | 20.81 | 38.11 | 15.44% | 1.38% | 16.82% |
* FCRN-A trained with 32 * (128x128) patches (MSE loss):
| Epochs | Split | MAE | RMSE | %U | %O | %D |
| :---: | :---: | :---: | :---: | :---: | :---: | :---: |
| 1/5 | train | 62.72 | 82.35 | 19.99% | 30.69% | 50.68% |
| 1/5 | test | 64.13 | 85.65 | 19.87% | 31.97% | 51.84% |
| 23/25 | train | 34.12 | 50.49 | 12.21% | 15.35% | 27.56% |
| 23/25 | test | 36.77 | 55.55 | 13.81% | 15.91% | 29.72% |
| 15/50 | train | 28.90 | 43.02 | 10.23% | 13.12% | 23.35% |
| 15/50 | test | 33.15 | 49.82 | 12.58% | 14.22% | 26.80% |
| 54/100 | train | 20.70 | 32.46 | 8.42% | 8.31% | 16.73% |
| 54/100 | test | 21.49 | 34.98 | 9.33% | 8.04% | 17.37% |
| 63.234234 | 140 | 0.403619 | yue_Hant | 0.392611 |
0c9a1fe6523a9b662ec1fa45e773f41b7dc33302 | 932 | md | Markdown | docs/vs-2015/code-quality/c28208.md | adrianodaddiego/visualstudio-docs.it-it | b2651996706dc5cb353807f8448efba9f24df130 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/vs-2015/code-quality/c28208.md | adrianodaddiego/visualstudio-docs.it-it | b2651996706dc5cb353807f8448efba9f24df130 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/vs-2015/code-quality/c28208.md | adrianodaddiego/visualstudio-docs.it-it | b2651996706dc5cb353807f8448efba9f24df130 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: C28208 | Microsoft Docs
ms.date: 11/15/2016
ms.prod: visual-studio-dev14
ms.technology: vs-ide-code-analysis
ms.topic: reference
f1_keywords:
- C28208
helpviewer_keywords:
- C28208
ms.assetid: e9a8ce37-3b05-4202-b078-5570ae496d1d
caps.latest.revision: 5
author: mikeblome
ms.author: mblome
manager: jillfra
ms.openlocfilehash: cf4fe198e5ca95a98375d6832f3352fda4e065d0
ms.sourcegitcommit: 8b538eea125241e9d6d8b7297b72a66faa9a4a47
ms.translationtype: MT
ms.contentlocale: it-IT
ms.lasthandoff: 01/23/2019
ms.locfileid: "58968811"
---
# <a name="c28208"></a>C28208
[!INCLUDE[vs2017banner](../includes/vs2017banner.md)]
warning C28208: Function \<function> was previously defined with a different parameter list at \<file> (\<line>). Some analysis tools will produce incorrect results

This warning is reported when a known definition of a function does not match another occurrence.
| 32.137931 | 190 | 0.796137 | ita_Latn | 0.805276 |
0c9b7fb20e3518204ca553ed29f044ab1497d28b | 159 | md | Markdown | Server/shadowsocks/README.md | fangdaidai/Shadowsocks-For-WHMCS | 8c7eef8169c6a50cfed68a85d683744bfcd19428 | [
"MIT"
] | 2 | 2018-05-21T15:37:29.000Z | 2021-09-25T16:46:42.000Z | Server/shadowsocks/README.md | fangdaidai/Shadowsocks-For-WHMCS | 8c7eef8169c6a50cfed68a85d683744bfcd19428 | [
"MIT"
] | null | null | null | Server/shadowsocks/README.md | fangdaidai/Shadowsocks-For-WHMCS | 8c7eef8169c6a50cfed68a85d683744bfcd19428 | [
"MIT"
] | 1 | 2017-04-27T12:21:30.000Z | 2017-04-27T12:21:30.000Z | ## Shadowsocks Manyuser
This is a Python-based, multi-user derivative of shadowsocks.

Forked from https://github.com/mengskysama/shadowsocks/tree/manyuser
| 31.8 | 68 | 0.81761 | eng_Latn | 0.934927 |
0c9bbc74c7a4d63878fa9959ce3fce89c3840023 | 3,550 | md | Markdown | articles/financials/accounts-payable/vendor-workflow.md | changeworld/Dynamics-365-Operations.ar-sa | 17c72d21a1645bb93365b23013f821feedcd6430 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/financials/accounts-payable/vendor-workflow.md | changeworld/Dynamics-365-Operations.ar-sa | 17c72d21a1645bb93365b23013f821feedcd6430 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/financials/accounts-payable/vendor-workflow.md | changeworld/Dynamics-365-Operations.ar-sa | 17c72d21a1645bb93365b23013f821feedcd6430 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: "Vendor workflow"
description: "You can modify vendor information and use a workflow to approve it."
author: mikefalkner
manager: annbe
ms.date: 08/24/2018
ms.topic: article
ms.prod:
ms.service: dynamics-ax-applications
ms.technology:
ms.search.form: Vendor
audience: Application User
ms.reviewer: shylaw
ms.search.scope: Core, Operations
ms.search.region: Global
ms.author: mikefalkner
ms.search.validFrom: 2018-08-30
ms.dyn365.ops.version: 8.0.4
ms.translationtype: HT
ms.sourcegitcommit: 98ed3378ab05c0c69c9e5b2a82310113a81c2264
ms.openlocfilehash: 950a1852acf9f3e4747ce2d55738c0eb3a646897
ms.contentlocale: ar-sa
ms.lasthandoff: 08/31/2018
---
# <a name="vendor-workflow"></a>Vendor workflow
[!include [banner](../includes/banner.md)]
When you use the vendor workflow, changes that are made to specific fields are sent to the workflow for approval before they are added to the vendor.
## <a name="set-up-the-vendor-workflow"></a>Set up the vendor workflow
Before you use the workflow feature, you must enable it.
1. Go to **Accounts payable \> Setup \> Accounts payable parameters**.
2. On the **General** tab, on the **Vendor approval** FastTab, set the **Enable vendor approvals** option to **Yes**.
3. In the **Data entity behavior** field, select the behavior that should be used when data is imported:
    - **Allow changes without approval** – A data entity can update the vendor record without processing it through the workflow.
    - **Reject changes** – No changes can be made to the vendor record. The import will fail for fields that are enabled for the workflow.
    - **Create change proposals** – All fields will be changed except the fields that are enabled for the workflow. The new values for those fields are added to the vendor as proposed changes, and the workflow starts automatically.
4. In the list of vendor fields, select the **Enable** check box for every field whose changes must be approved.
5. Go to **Accounts payable \> Setup \> Accounts payable workflows**.
6. Select **New**.
7. Select **Proposed vendor changes workflow**.
8. Set up the workflow so that it matches your approval process. The **Proposed vendor change workflow approval** element applies the changes to the vendor.
## <a name="change-vendor-information-and-submit-the-changes-to-the-workflow"></a>Change vendor information and submit the changes to the workflow
When you change a field that is enabled for the workflow, the **Proposed changes** page appears. This page shows the original value of the field and the new value that you entered. The field that you changed is reverted to its original value, and a status message informs you that your changes haven't been submitted.
Every time you change a field that is enabled for the workflow, the field is added to the list on the **Proposed changes** page. To discard the proposed value for a field, use the **Discard** button next to the field in the list. To discard all the changes, use the **Discard all changes** button at the bottom of the page. Select **OK** to close the page.
After you make at least one proposed change, two additional tabs appear on the Action Pane: **Proposed changes** and **Workflow**.
1. Select **Proposed changes** to open the **Proposed changes** page and review the changes that you made.
2. Select **Workflow \> Submit** to submit the changes to the workflow.
The status on the page changes to **Change approval pending**.
The workflow follows the standard workflow process in Microsoft Dynamics 365 for Finance and Operations. The approver is directed to the **Vendor** page, where they can review the changes on the **Proposed changes** page and then select **Workflow \> Approve** to approve the workflow. After all approvals are completed, the fields are updated with the values that you proposed.
#  Persona-API
[AppVeyor build status](https://ci.appveyor.com/project/cyrilschumacher/persona-api/branch/master)
Persona-API is a REST service that returns information from my resume. This project is used jointly with the [Persona](https://github.com/cyrilschumacher/Persona/) project.
## Getting Started
### Software requirements
This project requires:
+ [Visual Studio 2013](http://www.visualstudio.com/)
+ [.NET Framework 4.5](http://www.microsoft.com/download/details.aspx?id=30653)
## Copyright and license
> The MIT License (MIT)
>
> Copyright (c) 2014 Cyril Schumacher.fr
>
> Permission is hereby granted, free of charge, to any person obtaining a copy
> of this software and associated documentation files (the "Software"), to deal
> in the Software without restriction, including without limitation the rights
> to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
> copies of the Software, and to permit persons to whom the Software is
> furnished to do so, subject to the following conditions:
>
> The above copyright notice and this permission notice shall be included in all
> copies or substantial portions of the Software.
>
> THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
> IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
> FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
> AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
> LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
> OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
> SOFTWARE.
Python 3.0, often called Python 3000 or simply Py3k, is a major upgrade over earlier versions of Python. To avoid carrying too much legacy baggage, Python 3.0 was designed without backward compatibility in mind.
Many programs written for earlier Python versions will not run correctly on Python 3.0. To accommodate existing programs, Python 2.6 and 2.7 serve as transitional versions: they largely keep the Python 2.x syntax and libraries while preparing for migration to Python 3.0, allowing some Python 3.0 syntax and functions to be used.
Python 2.x and 3.x differ in many ways; the differences are summarized below.
# The print statement
The print statement is gone, replaced by the print() function. Python 2.6 and 2.7 partially support this form of the print syntax. If we call print the Python 2 way, without parentheses, Python 3 throws a syntax error (SyntaxError).
2.x
print 'Python', python_version()
print 'Hello, World!'
print('Hello, World!')
3.x
print('Python', python_version())
print('Hello, World!')
print("some text,", end="")
# Division
Division in Python is fancier than in most other languages, with a fairly elaborate set of rules. Python has two division operators, `/` and `//`.
/ division
* In Python 2.x, `/` behaves much like in familiar languages such as Java or C: dividing two integers yields an integer with the fractional part discarded entirely, while floating-point division keeps the fractional part and yields a float.
* In Python 3.x, `/` no longer works this way: dividing two integers also yields a float.
Python 2.x:
>>> 1 / 2
0
>>> 1.0 / 2.0
0.5
Python 3.x:
>>> 1/2
0.5
`//` division: this is called floor division; the result of the division is automatically floored, and it behaves the same in Python 2.x and Python 3.x.
>>> -1 // 2
-1
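The two operators side by side in Python 3 (a minimal sketch; note that `//` floors rather than truncating, which matters for negative operands):

```python
# Python 3: `/` always yields a float; `//` floors the result.
print(7 / 2)     # 3.5
print(7 // 2)    # 3
print(-7 // 2)   # -4  (floored toward negative infinity, not truncated to -3)
print(7.0 // 2)  # 3.0 (floor division on floats yields a float)
```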
# Comparisons raise exceptions
When you compare unorderable types, a TypeError is thrown.
2.x
>>> print None < None
False
>>> print 1 < ''
True
>>> print len <= len
True
In 3.x this raises `TypeError: unorderable types`
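To see the 3.x behavior for yourself, a minimal Python 3 snippet:

```python
# In Python 3, ordering comparisons between unrelated types raise TypeError.
try:
    1 < ''
    print('comparison succeeded')
except TypeError as exc:
    print('TypeError:', exc)
```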
# xrange
xrange no longer exists; the range function replaces the old xrange function.
A series of built-in functions and methods changed along with it: they now return iterator objects instead of lists or tuples — for example filter, map, and zip.
2.x
print range(3)
print type(range(3))
# [0, 1, 2]
# <type 'list'>
3.x
print(range(3))
print(type(range(3)))
print(list(range(3)))
range(0, 3)
<class 'range'>
[0, 1, 2]
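A short Python 3 example of the new iterator behavior for filter, map, and zip:

```python
# In Python 3, map/filter/zip return lazy iterator objects, not lists.
m = map(lambda x: x * 2, [1, 2, 3])
print(type(m))                        # <class 'map'>
print(list(m))                        # [2, 4, 6]
print(list(filter(None, [0, 1, 2])))  # [1, 2]
print(list(zip('ab', [1, 2])))        # [('a', 1), ('b', 2)]
```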
# String encoding
In Python, whether 2.x or 3.x, there are broadly only two kinds of strings:
* universal Unicode strings;
* strings encoded from Unicode into some encoding, such as UTF-8 or GBK;
In Python 2.x, an ordinary quoted string is of `str type`, and its encoding matches the encoding in which the Python source file itself is saved — on Windows, most commonly GBK by default. Adding the prefix u in front of a plain string marks it as `Unicode type`; you can also force the conversion with unicode().
In Python 3.x, an ordinary quoted string is already a Unicode str. Adding the prefix b marks it as a `bytes string`, that is, a byte sequence in some encoding (UTF-8, GBK, and so on).
Source files in 3.x default to UTF-8, so the following statement is legal:
>>> 中国 = 'china'
>>> print(中国)
china
2.x
>>> str = "测试"
>>> str
'\xe6\xb5\x8b\xe8\xaf\x95'
>>> str = u"测试"
>>> str
u'\u6d4b\u8bd5'
3.x
>>> str = "测试"
>>> str
'测试'
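A minimal Python 3 example of converting between text (str) and encoded bytes:

```python
# Python 3 separates text (str) from encoded bytes; convert explicitly.
s = '测试'
b = s.encode('utf-8')
print(type(s), type(b))        # <class 'str'> <class 'bytes'>
print(b)                       # b'\xe6\xb5\x8b\xe8\xaf\x95'
print(b.decode('utf-8') == s)  # True
```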
# Other changes
* Python 3 has no old-style classes, only new-style classes; you no longer need to subclass object explicitly, as in class Foobar(object): pass.
* Python 2.x has two spellings of "not equal", != and <>; Python 3.x drops <>, leaving only !=.
* Py3.X removes the long type; there is now a single integer type, int, which behaves like the long of the 2.X versions.
# Further reading
[What’s New In Python 3.0](https://docs.python.org/3/whatsnew/3.0.html)
[2to3 - Automated Python 2 to 3 code translation](https://docs.python.org/3/library/2to3.html)
---
layout: post
title: Create a Github Pull Request from the terminal
date: 2014-11-25 23:13:30
category: terminal
---
To quickly create a GitHub pull request on the site from the current working directory, use this snippet:
{% highlight bash %}
function git_pull_request()
{
local NAME=`git rev-parse --abbrev-ref HEAD`
local PATH_REPO=`git remote show -n origin | grep Push | cut -d: -f2- | cut -d/ -f 1- | tr -d " " | cut -d. -f-2`
git push origin $NAME
echo $PATH_REPO
open $PATH_REPO/compare/$NAME
}
{% endhighlight %}
!!! Note: this works only if the remote uses an HTTP(S) URL.
You can probably find a more up to date version in my [dotfiles](https://github.com/genintho/dotfiles)
# Lists active SharePoint site collection application catalogs
Inspired by: [David Ramalho](http://sharepoint-tricks.com/check-all-sharepoint-sites-collection-with-app-catalog-active/)
A sample that shows how to find all installed site collection application catalogs within a tenant. IT Professionals or DevOps can benefit from it when they govern tenants or scan tenant for customizations. Pulling a list with site collection app catalogs can give them valuable information at what scale the tenant site collections are customized. The sample outputs the URL of the site collection, and this can help IT Pros or DevOps to dig deeper and find out what and how many solution packages a site collection app catalog has installed. Check for un-healthy solution packages or such that could be a security risk.
Note, because the sample uses the SharePoint search API to identify the site collection application catalogs, a newly created one might not be indexed right away. The sample output would not list the newly created app catalog until the search crawler indexes it; this usually does not take longer than a few minutes.
=== "PowerShell"
```powershell
$appCatalogs = m365 spo search --query "contentclass:STS_List_336" --selectProperties SPSiteURL --allResults --output json | ConvertFrom-Json
$appCatalogs | ForEach-Object { Write-Host $_.SPSiteURL }
Write-Host 'Total count:' $appCatalogs.Count
```
=== "Bash"
```bash
#!/bin/bash
# requires jq: https://stedolan.github.io/jq/
appCatalogs=$(m365 spo search --query "contentclass:STS_List_336" --selectProperties SPSiteURL --allResults --output json)
echo $appCatalogs | jq -r '.[].SPSiteURL'
echo "Total count:" $(echo $appCatalogs | jq length)
```
Keywords:
- SharePoint Online
- Governance
- Security
---
lang: en
title: 'API docs: context.invocationcontext.assertmethodexists'
keywords: LoopBack 4.0, LoopBack 4, Node.js, TypeScript, OpenAPI
sidebar: lb4_sidebar
editurl: https://github.com/strongloop/loopback-next/tree/master/packages/context
permalink: /doc/en/lb4/apidocs.context.invocationcontext.assertmethodexists.html
---
<!-- Do not edit this file. It is automatically generated by API Documenter. -->
[Home](./index.md) > [@loopback/context](./context.md) > [InvocationContext](./context.invocationcontext.md) > [assertMethodExists](./context.invocationcontext.assertmethodexists.md)
## InvocationContext.assertMethodExists() method
Assert the method exists on the target. An error will be thrown if otherwise.
<b>Signature:</b>
```typescript
assertMethodExists(): Record<string, Function>;
```
<b>Returns:</b>
Record<string, Function>
---
UID: NS:ntddtape._TAPE_SET_MEDIA_PARAMETERS
title: "_TAPE_SET_MEDIA_PARAMETERS"
author: windows-driver-content
description: The TAPE_SET_MEDIA_PARAMETERS structure is used in conjunction with the IOCTL_TAPE_SET_MEDIA_PARAMS request to reset the block size of the media in a tape drive.
old-location: storage\tape_set_media_parameters.htm
old-project: storage
ms.assetid: f038eb24-71d2-414c-ad7c-06cb1fa24070
ms.author: windowsdriverdev
ms.date: 3/29/2018
ms.keywords: "*PTAPE_SET_MEDIA_PARAMETERS, PTAPE_SET_MEDIA_PARAMETERS, PTAPE_SET_MEDIA_PARAMETERS structure pointer [Storage Devices], TAPE_SET_MEDIA_PARAMETERS, TAPE_SET_MEDIA_PARAMETERS structure [Storage Devices], _TAPE_SET_MEDIA_PARAMETERS, ntddtape/PTAPE_SET_MEDIA_PARAMETERS, ntddtape/TAPE_SET_MEDIA_PARAMETERS, storage.tape_set_media_parameters, structs-tape_83d386fe-a430-4c8f-af97-2f6c7ecc4b67.xml"
ms.prod: windows-hardware
ms.technology: windows-devices
ms.topic: struct
req.header: ntddtape.h
req.include-header: Ntddtape.h, Minitape.h
req.target-type: Windows
req.target-min-winverclnt:
req.target-min-winversvr:
req.kmdf-ver:
req.umdf-ver:
req.ddi-compliance:
req.unicode-ansi:
req.idl:
req.max-support:
req.namespace:
req.assembly:
req.type-library:
req.lib:
req.dll:
req.irql:
topic_type:
- APIRef
- kbSyntax
api_type:
- HeaderDef
api_location:
- ntddtape.h
api_name:
- TAPE_SET_MEDIA_PARAMETERS
product:
- Windows
targetos: Windows
req.typenames: TAPE_SET_MEDIA_PARAMETERS, *PTAPE_SET_MEDIA_PARAMETERS
---
# _TAPE_SET_MEDIA_PARAMETERS structure
## -description
The TAPE_SET_MEDIA_PARAMETERS structure is used in conjunction with the <a href="https://msdn.microsoft.com/library/windows/hardware/ff560636">IOCTL_TAPE_SET_MEDIA_PARAMS</a> request to reset the block size of the media in a tape drive.
## -struct-fields
### -field BlockSize
Indicates the requested block size, in bytes, or zero for variable block size in a drive that supports it.
## -see-also
<a href="https://msdn.microsoft.com/library/windows/hardware/ff560636">IOCTL_TAPE_SET_MEDIA_PARAMS</a>
<a href="https://msdn.microsoft.com/library/windows/hardware/ff567953">TapeMiniSetMediaParameters</a>
---
layout: blog-date-archive
title: "October 2020"
permalink: /blog/2020/10/
archive-name: October 2020
archive-type: Monthly
breadcrumb: blog
---
## Hyperledger Indy Samples using the .NET SDK Wrapper
The solution in this directory provides sample code that can be used to gain an insight into how the API can be used.
# 271. Encode and Decode Strings
Design an algorithm to encode **a list of strings** to **a string**. The encoded string is then sent over the network and is decoded back to the original list of strings.
Machine 1 \(sender\) has the function:
```text
string encode(vector<string> strs) {
// ... your code
return encoded_string;
}
```
Machine 2 \(receiver\) has the function:
```text
vector<string> decode(string s) {
//... your code
return strs;
}
```
So Machine 1 does:
```text
string encoded_string = encode(strs);
```
and Machine 2 does:
```text
vector<string> strs2 = decode(encoded_string);
```
`strs2` in Machine 2 should be the same as `strs` in Machine 1.
Implement the `encode` and `decode` methods.
**Note:**
* The string may contain any possible characters out of 256 valid ascii characters. Your algorithm should be generalized enough to work on any possible characters.
* Do not use class member/global/static variables to store states. Your encode and decode algorithms should be stateless.
* Do not rely on any library method such as `eval` or serialize methods. You should implement your own encode/decode algorithm.
Method 1: Bencode-style length prefix
The encoding scheme is: length + '/' + string.
| `int` | [`indexOf`](https://docs.oracle.com/javase/9/docs/api/java/lang/String.html#indexOf-int-int-)`(int ch, int fromIndex)` | Returns the index within this string of the first occurrence of the specified character, starting the search at the specified index. |
| :--- | :--- | :--- |
```text
public class Codec {
// Encodes a list of strings to a single string.
public String encode(List<String> strs) {
StringBuilder sb = new StringBuilder();
for (String str : strs) {
sb.append(str.length()).append('/').append(str);
}
return sb.toString();
}
// Decodes a single string to a list of strings.
public List<String> decode(String s) {
List<String> res = new ArrayList<>();
int i = 0;
while (i < s.length()) {
int slash = s.indexOf('/', i);
int size = Integer.valueOf(s.substring(i, slash));
i = slash + size + 1;
res.add(s.substring(slash + 1, i));
}
return res;
}
}
// Your Codec object will be instantiated and called as such:
// Codec codec = new Codec();
// codec.decode(codec.encode(strs));
```
Method 2: escaping characters
| [`String`](https://docs.oracle.com/javase/9/docs/api/java/lang/String.html)`[]` | [`split`](https://docs.oracle.com/javase/9/docs/api/java/lang/String.html#split-java.lang.String-int-)`(`[`String`](https://docs.oracle.com/javase/9/docs/api/java/lang/String.html) `regex, int limit)` | Splits this string around matches of the given [regular expression](https://docs.oracle.com/javase/9/docs/api/java/util/regex/Pattern.html#sum). |
| :--- | :--- | :--- |
```text
public class Codec {
// Encodes a list of strings to a single string.
public String encode(List<String> strs) {
StringBuilder sb = new StringBuilder();
for (String s : strs) {
sb.append(s.replace("#", "##")).append(" # ");
}
return sb.toString();
}
// Decodes a single string to a list of strings.
public List<String> decode(String s) {
List<String> str = new ArrayList<>();
String[] res = s.split(" # ", -1);
        for (int i = 0; i < res.length - 1; i++) {
str.add(res[i].replace("##", "#"));
}
return str;
}
}
// Your Codec object will be instantiated and called as such:
// Codec codec = new Codec();
// codec.decode(codec.encode(strs));
```
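As a cross-check of the scheme (a hypothetical Python port for illustration, not part of the original Java solution), the same length-prefix idea from method 1 can be sketched as:

```python
def encode(strs):
    # Prefix each string with its length and a '/' delimiter.
    return ''.join(f'{len(s)}/{s}' for s in strs)

def decode(s):
    res, i = [], 0
    while i < len(s):
        slash = s.index('/', i)   # end of the length prefix
        size = int(s[i:slash])
        i = slash + 1 + size
        res.append(s[slash + 1:i])
    return res

print(decode(encode(['abc', '', '1/2'])))  # ['abc', '', '1/2']
```

Because the length prefix tells the decoder exactly how many characters to consume, strings containing '/' or '#' need no escaping.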
# Glossary
`Linux` `Windows` `Ascend` `GPU` `CPU` `End-to-end` `Beginner` `Intermediate` `Advanced`
<a href="https://gitee.com/mindspore/docs/blob/master/docs/mindspore/note/source_zh_cn/glossary.md" target="_blank"><img src="https://gitee.com/mindspore/docs/raw/master/resource/_static/logo_source.png"></a>
| Term/Abbreviation | Description |
| ----- | ----- |
| ACL | Ascend Computer Language. Provides C++ API libraries for device management, context management, stream management, memory management, model loading and execution, operator loading and execution, media data processing, and so on, for developing deep neural network applications. |
| AIR | Ascend Intermediate Representation. Similar to ONNX, an open file format defined by Huawei for machine learning that is better adapted to the Ascend AI processor. |
| Ascend | The name of Huawei's Ascend series of chips. |
| CCE | Cube-based Computing Engine. An operator development tool for programming against the hardware architecture. |
| CCE-C | Cube-based Computing Engine C. C code developed using CCE. |
| CheckPoint | MindSpore model training checkpoint. Stores the model's parameters; can be used to save a model for inference or for further training. |
| CIFAR-10 | An open image dataset containing 60,000 32x32 color images in 10 classes, with 6,000 images per class. There are 50,000 training images and 10,000 test images. |
| CIFAR-100 | An open image dataset with 100 classes, each containing 500 training images and 100 test images. |
| Davinci | Da Vinci architecture, Huawei's in-house chip architecture. |
| EulerOS | Huawei's in-house operating system based on the standard Linux kernel. |
| FC Layer | Fully connected layer. Acts as the classifier of a convolutional neural network. |
| FE | Fusion Engine. Bridges GE and TBE operators; provides capabilities such as loading and managing the operator information library and managing fusion rules. |
| Fine-tuning | Training a network model for a second, similar task starting from a model trained for one task. |
| FP16 | 16-bit floating point. Half-precision floating-point arithmetic that consumes less memory. |
| FP32 | 32-bit floating point. Single-precision floating-point arithmetic. |
| GE | Graph Engine. MindSpore's computational graph execution engine, mainly responsible for hardware-specific optimizations of the graph from the front end (operator fusion, memory reuse, and so on) and for launching tasks on the device side. |
| GHLO | Graph High Level Optimization. Includes hardware-independent optimizations (such as dead code elimination), automatic parallelism, automatic differentiation, and related functionality. |
| GLLO | Graph Low Level Optimization. Includes hardware-dependent optimizations as well as deep hardware/software co-optimizations such as operator fusion and buffer fusion. |
| Graph Mode | MindSpore's static graph mode. Compiles the neural network model into a single graph that is then dispatched for execution; offers high performance. |
| HCCL | Huawei Collective Communication Library. Implements multi-machine, multi-card communication based on Da Vinci architecture chips. |
| ImageNet | An image database organized according to the WordNet hierarchy (currently nouns only). |
| LeNet | A classic convolutional neural network architecture proposed by Yann LeCun and others. |
| Loss | The deviation of predicted values from actual values; a standard used in deep learning to judge how good a model is. |
| LSTM | Long short-term memory. The corresponding network is a recurrent neural network suited to processing and predicting important events separated by very long intervals and delays in a time series. |
| Manifest | A data format file adopted by Huawei ModelArts; for details, see <https://support.huaweicloud.com/engineers-modelarts/modelarts_23_0009.html>. |
| ME | Mind Expression. The MindSpore front end, which mainly compiles user source code into computational graphs, controls execution and maintains context during training (in non-sink-mode configurations), and provides the dynamic graph (PyNative mode), among other things. |
| MindArmour | MindSpore's security module. Uses techniques such as differential privacy and adversarial attack/defense to improve model confidentiality, integrity, and availability, preventing attackers from maliciously modifying a model or cracking its internals to steal its parameters. |
| MindData | MindSpore's data framework. Provides data loading, augmentation, dataset management, and visualization. |
| MindInsight | MindSpore's visualization component. Visualizes scalars, images, computational graphs, model hyperparameters, and more. |
| MindIR | MindSpore IR. A graph-based functional IR that defines an extensible graph structure and an operator IR representation, and stores MindSpore's basic data structures. |
| MindRecord | A data format defined by MindSpore; a module for reading, writing, searching, and converting datasets in MindSpore format. |
| MindSpore | The open-source deep learning framework led by Huawei. |
| MindSpore Lite | A lightweight deep neural network inference engine that runs models trained with MindSpore on device. |
| MNIST database | Modified National Institute of Standards and Technology database. A large database of handwritten digits, commonly used for training various image processing systems. |
| ONNX | Open Neural Network Exchange. An open file format designed for machine learning, used to store trained models. |
| PyNative Mode | MindSpore's dynamic graph mode. Dispatches the operators in the neural network one by one for execution, making it easy to write and debug neural network models. |
| ResNet-50 | Residual Neural Network 50. A residual neural network proposed by Kaiming He and three other Chinese researchers at Microsoft Research. |
| Schema | A dataset structure definition file, used to define which fields a dataset contains and the types of those fields. |
| Summary | An operator that monitors the values of tensors in the network; a "peripheral" operation in the graph that does not affect the data flow itself. |
| TBE | Tensor Boost Engine. Huawei's in-house NPU operator development tool, extended on top of the TVM (Tensor Virtual Machine) framework; provides a set of Python APIs for development activities, used for custom operator development. |
| TFRecord | A data format defined by TensorFlow. |
# XCompose file syntax highlighting for Atom
Highlight your XCompose files. Are you like me, and love your Unicode characters?
Highlight those boring XCompose files!
![A screenshot][screenshot-1]
[screenshot-1]: https://raw.githubusercontent.com/samcv/language-xcompose/master/images/screenshot-1.png
---
title: Create your backlog and tasks using Project
titleSuffix: Azure Boards and TFS
description: Add items, plan, order, and estimate your backlog of deliverables in Azure Boards or Team Foundation Server
ms.technology: devops-agile
ms.prod: devops
ms.assetid: be5cef4f-755f-4ffe-8dd7-876d1e02c330
ms.manager: douge
ms.author: kaelli
author: KathrynEE
ms.topic: conceptual
ms.date: 07/21/2017
---
# Create your backlog and tasks using Project
[!INCLUDE [temp](../../_shared/version-vsts-tfs-all-versions.md)]
If Office Project is your preferred tool for tracking projects, you can use it to create your backlog, schedule tasks, assign resources, and track work that is also tracked in Azure Boards or Team Foundation Server (TFS). You can use Project while your development team uses the tools they prefer, all while sharing information transparently.
Working in Project is similar to publishing and refreshing work items using [Office Excel](bulk-add-modify-work-items-excel.md), with a few differences as described [later in this topic](#differences).
Use this topic to learn how to:
>[!div class="checklist"]
> * Connect a Project plan to a project
> * Add tasks to Project and publish them as work items
> * Indent tasks to create parent-child links
> * Link tasks to create predecessor-successor links
> * View how Project columns map to work item fields
> [!NOTE]
>You can also manage projects using Project Professional and [Project Server synchronized with TFS](../../../reference/tfs-ps-sync/synchronize-tfs-project-server.md) , but you can't use Project Professional to both publish and refresh to TFS and synchronize with TFS.
## Add tasks and publish work items
1. If you don't have Office Project 2007 or a more recent version, [install it](https://products.office.com/project). For Azure Boards and TFS 2017 and later versions, you'll need Project 2010 or later version.
> [!NOTE]
>You can't use Office Project 365 to connect to Azure Boards and TFS.
2. If you haven't installed a version of [Visual Studio (2010 or later)](https://visualstudio.microsoft.com/downloads/download-visual-studio-vs) or the [Team Foundation Server Standalone Office Integration 2015 (free)](https://visualstudio.microsoft.com/downloads/#team-foundation-server-office-integration-2015-update-3-1), you'll need to install one of these versions to connect to an Azure Boards or TFS project.
> [!NOTE]
>The only way to get the Team Foundation plug-in is by installing one of the latest editions of Visual Studio or the TFS Standalone Office Integration installer. TFS Office Integration 2015 supports connection to Azure Boards and TFS from Excel, Project, and the PowerPoint-based storyboarding tool.
3. In Project, start with a blank worksheet. If you don't see the **Team** ribbon (or the **Team** menu if you use Project 2007) see step 2 or [TFS-Office integration issues](tfs-office-integration-issues.md).
> [!TIP]
>If you want to first import a list or tree of work items you've already defined, follow steps 3 and 4 under [Bulk add or modify work items with Excel, Add work items](bulk-add-modify-work-items-excel.md#add-work-items). In the New list dialog, select the **Query** that contains the work items you want to import.

Another way to start is to open a backlog query in Team Explorer and from the context menu, choose **Open Query in Microsoft Project**.
>**Tip:** If the **Team** ribbon no longer appears, you might need to [re-enable it](https://msdn.microsoft.com/library/vstudio/ms268871.aspx).
4. Connect to TFS and the project that you want to plan. If you can't connect, [get added as a team member](../../../organizations/settings/add-teams.md).

If it's your first time connecting to TFS from Project, you might have to add the name of your TFS to the list of recognized servers.

Project is now bound to your project. The Team Foundation Gantt view supports entry and display of several TFS fields.

5. Add task information and then publish the project. To add a work item, specify the **Title**, **Work Item Type**, **Publish and Refresh**, and any other required fields. Resize and move columns using standard [Project methods](http://office.microsoft.com/client/helppreview14.aspx?AssetId=HP010351693&lcid=1033&NS=WINPROJ&Version=14&tl=2&pid=CH010359308&CTT=4).
>**Tip:** Set the **Publish and Refresh** field for a task to **Yes** if you want to have a work item created for it in TFS. For example, set user stories, backlog items, and tasks to be published and refreshed. However, any summary tasks that you create to group tasks or to assign milestones, set **Publish and Refresh** to **No**.

Notice how IDs are now assigned to your work items.

Optionally, you can use , select a work item query, and add work items from TFS to your project plan.
6. Assign resources to tasks. Or, leave that field blank for the development team to assign.

>**Tip:** Although Project supports allocation of more than one resource to a task, TFS does not. If a task requires more than one resource to complete, divide the task into subtasks and assign one resource to each subtask. Only assign a TFS team member as a resource to those tasks that you will publish.
>
>Specify resources by their display names from Active Directory Domain Services (AD DS). If you assign a resource by its alias or other name, you risk incurring validation errors.
7. Save your project plan to retain scheduling and other data that TFS doesn't store.
## Indent tasks to create parent-child links
When you indent tasks and then publish your plan, you create parent-child links between work items. Tasks will show up on the [task board](../../sprints/task-board.md) when they are assigned to the current sprint.

To see the parent-child links that you just created, open **Links and Attachments**.

## Link tasks to create predecessor-successor links
When you link two tasks and publish your plan, TFS creates predecessor-successor links between the two work items.

Although the work tracking system tracks predecessor-successor dependencies as work item links, it does not track dependency types, lead and lag time, or other constraints that Project does.
## Specify data for other work tracking fields
To enter data into other work tracking fields, switch to the Team Foundation Task Sheet.

This view displays all the work tracking fields that have been mapped to Project.

Optionally, you can add a mapped work tracking field as a column to the Team Foundation Gantt view. To see which work tracking fields are mapped, open **Column Mappings**.

To add more work tracking fields or change the way fields are mapped, see [Customize the Microsoft Project field mapping file](https://msdn.microsoft.com/library/ms404686.aspx). This feature is available for both Azure Boards and TFS.
## Tips for working in Project and other Team Foundation clients
You can manage your project plan using Project and all the features that Project provides. Because you and other team members can modify TFS work items from the web portal, Excel, Project, and Team Explorer, follow these tips to manage your work effectively:
<table>
<tbody>
<tr>
<td><ul>
<li><p>When you first open a project plan, use <img src="_img/create-your-backlog-tasks-using-project/IC652594.png" title="Refresh icon in Excel on Team ribbon" alt="Refresh icon in Excel on Team ribbon" /> (<strong>Refresh</strong>) to download the latest data from TFS.</p></li>
<li><p>Publish your changes and refresh your plan periodically while you work. Otherwise, you can encounter data conflicts between plan data and the TFS data store.</p></li>
<li><p>Save your project plan to maintain scheduling data and other information that TFS doesn't store.</p></li>
<li><p>When defining areas and iterations, keep in mind that Project and Excel restrict the length of the area and iteration path field to 256 characters.</p></li>
<li><p>In Project 2010 and later versions, when you choose <img src="_img/create-your-backlog-tasks-using-project/IC413649.png" title="Pinned task icon" alt="Pinned task icon" /> (Manually scheduled tasks), team members can place a manually scheduled task anywhere in their schedules, and Project will not move it. In order for team members to manually schedule their tasks, you will have to add the necessary project fields to TFS task definitions.</p>
<p>Start and finish dates for autoscheduled tasks (<img src="_img/create-your-backlog-tasks-using-project/IC413651.png" title="Auto Update Task Mode icon" alt="Auto Update Task Mode icon" />) are determined by the scheduling engine based on task dependencies and the project calendar, as in previous releases of Project.</p></li>
</ul></td>
<td><ul>
<li><p>Use Project to manage and update changes to these fields:</p>
<ul>
<li><p>Finish Date</p></li>
<li><p>Start Date</p></li>
<li><p>Calculated fields for completed and remaining work</p></li>
</ul>
<p>Although TFS can store estimated, completed, and remaining work, and start and finish dates, TFS does not recalculate the fields when updates to these fields are made.</p>
<p>When you publish to TFS, start and finish times are read-only in TFS by default. Project does not download start and finish times when you refresh the plan. </p></li>
<li><p>If you see that hours are counted twice in reports that contain task hours, <a href="https://msdn.microsoft.com/library/dd997572">correct the problem</a>.</p>
<p>Project assigns parent tasks the rollup of hours that are assigned to all its child tasks. Rollup hours are not published to TFS to prevent hours within reports from being counted twice. The Microsoft Project mapping file attribute, IfSummaryRefreshOnly, suppresses the hours that are assigned to summary tasks.</p></li>
</ul></td>
</tr>
</tbody>
</table>
<a id="differences" />
## Differences working in Project versus Excel
|Area|Project|Excel|
|---|---|---|
|Adding TFS fields|You can only add fields to your Project plan that are defined in the Microsoft Project mapping file.|You can add any TFS field to your Excel worksheet that is listed in the Choose Columns dialog, subject to Excel limitations on text length.|
|Publish/Refresh|You specify the **Publish and Refresh** field for individual tasks. Also, field attributes defined in the Microsoft Project mapping file affect how fields are published and refreshed.|All work items are subject to publish and refresh.|
|Linking|You can create and modify parent-child links or predecessor-successor links between work items.|Using the tree list view, you can create and modify parent-child links.|
## Related articles
- [Bulk modify work items using Excel](bulk-add-modify-work-items-excel.md)
- [Create your backlog](../../backlogs/create-your-backlog.md)
- [Requirements and compatibility](/tfs/server/requirements)
If the Team ribbon fails to appear, see [TFS-Office integration issues](tfs-office-integration-issues.md).
### Delete work items
You can't delete work items from Excel. The only way to delete work items is from the web portal or the **witadmin** command-line tool. For details, see [Move, change, or delete work items](../../backlogs/remove-delete-work-items.md).
### Do you want to add Project fields to TFS work items?
For team members to be able to view or modify Project fields from a Team Foundation client, you must customize both the definition file for the task work item type and update the Microsoft Project Mapping file. For resources, see [Schedule projects using Microsoft Project 2010](https://msdn.microsoft.com/library/ff731586.aspx).
### Do you want to map additional TFS fields to Project, or change how fields are mapped?
You can change how Team Foundation fields map to fields in Microsoft Project, and you can change how specific fields are published. See [The Microsoft Project Field Mapping File](https://msdn.microsoft.com/library/ms404686.aspx).
### Project for Mac
macOS is not supported. You need to use Project on the same computer where you have installed Visual Studio or the Team Foundation Server Standalone Office Integration 2015 in order to get the Team Foundation add-in. These applications require Windows.
[!INCLUDE [temp](../../../_shared/help-support-shared.md)]
> Keep hungry, Keep foolish.
> Hungry for knowledge, humble as a fool.
The internet is a big place; meeting here is fate.
I'm Tang Chuanlin, currently a master's student in electronics and communications. I'm interested in electronic circuits, embedded systems development, open-source router firmware, software-defined radio (SDR), audio processing, and data processing and modeling.
I have a moderate case of "squirrel hoarding": since childhood I've enjoyed collecting all kinds of practical, efficient software from the internet.
In my spare time I like maintaining my personal blog, browsing Zhihu and technical forums, and playing badminton.
> About this site
The site's theme was originally provided by [Hux](https://github.com/Huxpro/huxpro.github.io) and [Yizibi](https://github.com/yizibi/yizibi.github.io); many thanks to both.
The source code of this site is available [here](https://github.com/nicktcl/nicktcl.github.io). If you like it, feel free to [clone](https://github.com/nicktcl/nicktcl.github.io) it and build your own personalized blog.
If you think this site is nice, please give it a [star](https://github.com/nicktcl/nicktcl.github.io); your [star](https://github.com/nicktcl/nicktcl.github.io) motivates me to keep improving and persisting. If you also happen to be interested in having your own blog, we can exchange links.
> Links
Link exchanges are really just a way for us to keep each other motivated. If you'd like to exchange links with me, leave your link in an [issue](https://github.com/nicktcl/nicktcl.github.io/issues) and note the title.
Also, since we'd be linked, I hope we can both keep our blogs updated instead of losing interest after three minutes; that's what makes a link exchange worthwhile. Thanks!
### Response From WeChat
There's some work we have to do on each particular platform (if you don't need this, just ignore it).
### Android
Fluwx will create `WXEntryActivity` or `WXPayEntryActivity` by itself since *0.4.0*, so the following code isn't necessary.
~~For`Android`,create `WXEntryActivity`or`WXPayEntryActivity`,and override the following function:~~
```kotlin
public override fun onCreate(savedInstanceState: Bundle?) {
super.onCreate(savedInstanceState)
WXAPiHandler.wxApi?.handleIntent(intent, this)
}
override fun onNewIntent(intent: Intent) {
super.onNewIntent(intent)
setIntent(intent)
WXAPiHandler.wxApi?.handleIntent(intent, this)
}
override fun onResp(resp: BaseResp) {
FluwxResponseHandler.handleResponse(resp)
finish()
}
```
~~You can also directly inherit `FluwxWXEntryActivity`,and then you can do nothing.
For the rule of creating `WXEntryActivity` and `WXPayEntryActivity`,please read [example wxapi](https://github.com/OpenFlutter/fluwx/tree/master/example/android/app/src/main/kotlin/net/sourceforge/simcpux/wxapi )~~
~~,never forget to register your Activity in `AndroidManifest.mxl`:~~
```xml
<activity
android:name="your.package.name.registered.on.wechat.wxapi.WXEntryActivity"
android:theme="@style/DisablePreviewTheme"
android:exported="true"
android:launchMode="singleTop"/>
<activity
android:name="your.package.name.registered.on.wechat.wxapi.WXPayEntryActivity"
android:theme="@style/DisablePreviewTheme"
android:exported="true"
android:launchMode="singleTop"/>
```
#### However Customization Is Always Good
Well, sometimes you need to create `WXEntryActivity` and `WXPayEntryActivity` by yourself because your project isn't
a pure Flutter project. The `WXEntryActivity` and `WXPayEntryActivity` must be under *packageName/wxapi/*; you
can inherit `FluwxWXEntryActivity` for convenience. Then register `WXEntryActivity` and `WXPayEntryActivity` in
`AndroidManifest.xml`:
```
<activity android:name=".wxapi.WXEntryActivity"
android:theme="@style/DisablePreviewTheme"
/>
<activity android:name=".wxapi.WXPayEntryActivity"
android:theme="@style/DisablePreviewTheme"/>
<activity-alias
android:name="${applicationId}.wxapi.WXEntryActivity"
android:exported="true"
tools:replace="android:targetActivity"
android:targetActivity=".wxapi.WXEntryActivity"
android:launchMode="singleTop">
<intent-filter>
<action android:name="android.intent.action.VIEW" />
<category android:name="android.intent.category.DEFAULT" />
<data android:scheme="sdksample" />
</intent-filter>
</activity-alias>
<activity-alias
tools:replace="android:targetActivity"
android:name="${applicationId}.wxapi.WXPayEntryActivity"
android:exported="true"
android:targetActivity=".wxapi.WXPayEntryActivity"
android:launchMode="singleTop">
<intent-filter>
<action android:name="android.intent.action.VIEW" />
<category android:name="android.intent.category.DEFAULT" />
<data android:scheme="sdksample" />
</intent-filter>
</activity-alias>
```
### iOS
Override the following functions in `AppDelegate`:
```objective-c
- (BOOL)application:(UIApplication *)application openURL:(NSURL *)url sourceApplication:(NSString *)sourceApplication annotation:(id)annotation {
return [WXApi handleOpenURL:url delegate:[FluwxResponseHandler defaultManager]];
}
// NOTE: Use the new API on iOS 9.0 and later
- (BOOL)application:(UIApplication *)app openURL:(NSURL *)url options:(NSDictionary<NSString*, id> *)options
{
return [WXApi handleOpenURL:url delegate:[FluwxResponseHandler defaultManager]];
}
```
> NOTE: Don't forget to add a URL scheme so the user can return to your app.
### Flutter
We can get the response from WeChat after sharing, etc.:
```dart
fluwx.responseFromShare.listen((response){
//do something
});
fluwx.responseFromAuth.listen((response){
//do something
});
fluwx.responseFromPayment.listen((response){
//do something
});
```
> NOTE: If a field name starts with "android" or "iOS", it means that only Android or iOS has that field.
The type of the return value is `WeChatResponse`, and `type` is an enum:
```dart
enum WeChatResponseType {
SHARE,
AUTH,
  PAYMENT
}
```
`result` is the actual response from WeChat; it's a `Map`. Read the WeChat documentation for more details.
However, there is an additional param `platform`; the value of `platform` is `android` or `iOS`.
# PubNub 3.7 Web Data Push Cloud-Hosted API
# Pocket PC /Windows CE SDK
Open PubNub-Messaging\PubNub-Messaging.sln
Run the project in "Pocket PC 2003 SE Square Emulator" to see a working example targeting .Net Compact Framework 2.0. The main core functionality lies in the PubNub-Messaging.csproj project.
## Requirements
1. Windows XP OS or Windows XP Mode on Windows 7
2. Visual Studio 2008
3. Windows Mobile 6 Professional SDK Refresh
4. .NET Compact Framework 2.0 SP2
## Third party software/source code used
1. Newtonsoft.Json.Compact library code
2. The Bouncy Castle Cryptographic C# API
3. ConcurrentHashtable (TvdP.Collections)
4. NUnitLiteCF
### Object Cleanup
For best performance after completion of all intended operations, please call the EndPendingRequests() method of the Pubnub instance, and assign it to null. This will help ensure speedy resources cleanup when you are done with the object.
## Running the emulator
1. To run Pocket PC / Windows CE SDK, you need Windows XP OS with Visual Studio 2008.
2. From Device Emulator Manager, right-click "Pocket PC 2003 SE Square Emulator" and connect. Once connected, right-click and select "Cradle". A "New Partnership" window will pop up; select "Guest Partnership" and click Next to connect.
3. The ActiveSync window displays a "Connected" message.
4. Ensure that the date and time are correct by going to emulator Settings -> System -> Clock & Alarms. This is very important for the PAM feature.
## Running the Demo/Example App
1. Open up the solution file and select "Pocket PC 2003 SE Square Emulator" as target device.
2. Right click on PubNubMessagingExample project, and set as Startup Project
3. Build the project. Please make sure you compile the whole solution because third party libraries are added as project reference.
4. CTRL-F5 or F5 to run it.
5. Demo will run on Pocket PC emulator
## Running the Tests
1. Open up the solution file and select "Pocket PC 2003 SE Square Emulator" as target device
2. Right click on PubNub-Messaging.Tests project, and set as Startup Project
3. Build the project. Please make sure you compile the whole solution because third party libraries are added as project reference.
4. CTRL-F5 or F5 to run it.
5. Unit tests will run on Pocket PC emulator
## Known issues
1. Occasionally, due to concurrent web requests, the JSON response of one web request may arrive as the response of another web request. A workaround fix is in place to handle this known unexpected JSON response.
Report an issue, or email us at support if there are any additional questions or comments.
---
title: "Azure PowerShell script sample: Create a managed disk from a VHD file in a storage account in the same or a different subscription | Microsoft Docs"
description: "Azure PowerShell script sample: Create a managed disk from a VHD file in a storage account in the same or a different subscription"
services: virtual-machines-windows
documentationcenter: storage
author: ramankumarlive
manager: kavithag
editor: tysonn
tags: azure-service-management
ms.assetid:
ms.service: virtual-machines-windows
ms.devlang: na
ms.topic: sample
ms.tgt_pltfrm: vm-windows
ms.workload: infrastructure
ms.date: 06/05/2017
ms.author: ramankum
ms.translationtype: Human Translation
ms.sourcegitcommit: 09f24fa2b55d298cfbbf3de71334de579fbf2ecd
ms.openlocfilehash: 079ab69da3c2cb6cc38f0a766ffeb47ed84c90a3
ms.contentlocale: de-de
ms.lasthandoff: 06/07/2017
---
# <a name="create-a-managed-disk-from-a-vhd-file-in-a-storage-account-in-same-or-different-subscription-with-powershell"></a>Create a managed disk from a VHD file in a storage account in the same or a different subscription with PowerShell
This script creates a managed disk from a VHD file in a storage account in the same or a different subscription. You can use it to import a specialized (not generalized/sysprepped) VHD to a managed OS disk and create a virtual machine from it. You can also use it to import a data VHD to a managed data disk.
Do not create multiple identical managed disks from one VHD file in a short amount of time. To create a managed disk from a VHD file, a blob snapshot of the VHD file is created and then used to create the managed disk. Only one blob snapshot can be created per minute, and the resulting throttling can cause disk creation failures. To avoid this throttling, create a managed snapshot from the VHD file (for more information, see [Create a snapshot from a VHD](./../scripts/storage-windows-powershell-sample-create-snapshot-from-vhd.md?toc=%2fpowershell%2fmodule%2ftoc.json)), and then use the managed snapshot to create multiple managed disks in a short amount of time.
[!INCLUDE [sample-powershell-install](../../../includes/sample-powershell-install.md)]
[!INCLUDE [quickstarts-free-trial-note](../../../includes/quickstarts-free-trial-note.md)]
## <a name="sample-script"></a>Sample script
[!code-powershell[main](../../../powershell_scripts/storage/create-managed-disks-from-vhd-in-different-subscription/create-managed-disks-from-vhd-in-different-subscription.ps1 "Create managed disks from VHDs")]
## <a name="script-explanation"></a>Script explanation
This script uses the following commands to create a managed disk from a VHD in a different subscription. Each command in the table links to command-specific documentation.
| Command | Notes |
|---|---|
| [New-AzureRmDiskConfig](/powershell/module/azurerm.compute/New-AzureRmDiskConfig) | Creates the disk configuration that is used for disk creation. It includes the storage type, the location, the resource ID of the storage account in which the parent VHD is stored, and the VHD URI of the parent VHD file. |
| [New-AzureRmDisk](/powershell/module/azurerm.compute/New-AzureRmDisk) | Creates a disk from the disk configuration, disk name, and resource group name that are passed as parameters. |
## <a name="next-steps"></a>Next steps
[Create a virtual machine from an existing managed OS disk with PowerShell](./../../virtual-machines/scripts/virtual-machines-windows-powershell-sample-create-vm-from-managed-os-disks.md?toc=%2fpowershell%2fmodule%2ftoc.json)
For more information about the Azure PowerShell module, see the [Azure PowerShell documentation](/powershell/azure/overview).
Additional virtual machine PowerShell script samples can be found in the [Azure Windows VM documentation](../../virtual-machines/windows/powershell-samples.md?toc=%2fazure%2fvirtual-machines%2fwindows%2ftoc.json).
---
title: Default offset accounts for vendor invoice and invoice approval journals
description: This topic helps you decide where to assign default accounts for invoice journals.
author: abruer
ms.date: 01/12/2018
ms.topic: article
ms.prod: ''
ms.technology: ''
ms.search.form: LedgerJournalTable
audience: Application User
ms.reviewer: roschlom
ms.custom: 62093
ms.assetid: 553933ca-928d-4031-bb8c-f9cff458320b
ms.search.region: global
ms.author: shpandey
ms.search.validFrom: 2016-02-28
ms.dyn365.ops.version: AX 7.0.0
ms.openlocfilehash: e1b0184850602191da5448df25779437f70e5c3182e1b7b70d92d4c406e08599
ms.sourcegitcommit: 42fe9790ddf0bdad911544deaa82123a396712fb
ms.translationtype: HT
ms.contentlocale: fr-FR
ms.lasthandoff: 08/05/2021
ms.locfileid: "6749026"
---
# <a name="default-offset-accounts-for-vendor-invoice-and-invoice-approval-journals"></a>Default offset accounts for vendor invoice and invoice approval journals
[!include [banner](../includes/banner.md)]
Default offset accounts are used on the following vendor invoice journal pages:
- Invoice journal
- Invoice approval journal
Use the following table to help you decide where to assign default accounts for invoice journals.
<table>
<colgroup>
<col width="25%" />
<col width="25%" />
<col width="25%" />
<col width="25%" />
</colgroup>
<thead>
<tr class="header">
<th>Set up default accounts here...</th>
<th>...to provide default accounts here</th>
<th>How this option affects processing</th>
<th>When to use this option</th>
</tr>
</thead>
<tbody>
<tr class="odd">
<td><strong>Vendor group</strong> – Set up default offset accounts for vendor groups on the <strong>Default account setup</strong> page, which you can open from the <strong>Vendor groups</strong> page.</td>
<td><ul>
<li>Vendor account</li>
<li>Journal entries for the vendor accounts in the vendor group, if default accounts are not specified for the vendor accounts</li>
</ul></td>
<td>The default offset accounts for vendor groups are displayed as default offset accounts for vendors on the <strong>Default account setup</strong> page. You can open this page from the <strong>All vendors</strong> list page.</td>
<td>Use this option if you typically pay for the same kinds of things from the same vendor groups.</td>
</tr>
<tr class="even">
<td><strong>Vendor account</strong> – Set up default accounts for vendor accounts on the <strong>Default account setup</strong> page, which you can open from the <strong>Vendors</strong> page.</td>
<td>Journal entries for the vendor account</td>
<td>The default offset accounts for vendor accounts are displayed as the default offset accounts for the journal entries for that vendor account.</td>
<td>Use this option if you typically pay for the same kinds of things from the same vendors.</td>
</tr>
<tr class="odd">
<td><strong>Journal names</strong> – Set up default offset accounts for journals on the <strong>Journal names</strong> page. Select the <strong>Fixed offset account</strong> option. Note that you cannot specify default offset accounts in journal names if the journal type of the journal name is <strong>Invoice register</strong> or <strong>Approval</strong>.</td>
<td><ul>
<li>Journal header that uses the journal name</li>
<li>Journal entries in journals that use the journal name</li>
</ul></td>
<td>If the <strong>Fixed offset account</strong> option on the <strong>Journal names</strong> page is selected, the offset account for the journal name overrides the default offset account for the vendor or vendor group.</td>
<td>Use this option to set up journals for specific costs and expenses that are charged to specific accounts, regardless of the vendor or the vendor group that the vendor belongs to.</td>
</tr>
<tr class="even">
<td><strong>Journal names</strong> – Set up default offset accounts for journals on the <strong>Journal names</strong> page. Clear the <strong>Fixed offset account</strong> option. Note that you cannot specify default offset accounts in journal names if the journal type of the journal name is <strong>Invoice register</strong> or <strong>Approval</strong>.</td>
<td><ul>
<li>Journal header</li>
<li>Journal entries in journals that use the journal name</li>
</ul></td>
<td>These default entries are used on journal header forms, and the offset account on the journal header form is used as the default entry on the journal voucher pages. The default accounts on the <strong>Journal names</strong> page are used only if default accounts are not set up for the vendor account.</td>
<td>Use this option to set up default accounts to use when a default offset account for the vendor is not assigned.</td>
</tr>
<tr class="odd">
<td><strong>Journal header</strong> – Set up a default offset account for a journal to use as the default entry on the journal voucher pages. Note that you cannot specify default offset accounts on the journal header if the journal type of the journal name is <strong>Invoice register</strong> or <strong>Approval</strong>.</td>
<td>Journal entries in the journal</td>
<td>The default offset account for a journal is used as the default entry on the journal voucher pages.</td>
<td>Use this option to speed up data entry if most entries in a journal have the same offset account.</td>
</tr>
</tbody>
</table>
[!INCLUDE[footer-include](../../includes/footer-banner.md)]
---
layout: post
title: 73. Set Matrix Zeroes
gh-badge: [star, fork, follow]
tags: [Array]
comments: true
---
```python
# time O(MN * (M + N)) in the worst case, space O(MN)
class Solution:
def setZeroes(self, matrix: List[List[int]]) -> None:
"""
Do not return anything, modify matrix in-place instead.
"""
seen = set()
q = []
row = len(matrix)
col = len(matrix[0])
for i in range(row):
for j in range(col):
if matrix[i][j] == 0:
q.append((i,j))
seen.add((i,j))
for each in q:
x = each[0]
y = each[1]
# row set to 0
for i in range(row):
if (i,y) not in seen:
seen.add((i,y))
matrix[i][y] = 0
for j in range(col):
if (x,j) not in seen:
seen.add((x,j))
matrix[x][j] = 0
```
Optimal set-based solution:
```python
# time O(MN) space O(M+N)
class Solution(object):
def setZeroes(self, matrix):
"""
:type matrix: List[List[int]]
:rtype: void Do not return anything, modify matrix in-place instead.
"""
R = len(matrix)
C = len(matrix[0])
rows, cols = set(), set()
# Essentially, we mark the rows and columns that are to be made zero
for i in range(R):
for j in range(C):
if matrix[i][j] == 0:
rows.add(i)
cols.add(j)
# Iterate over the array once again and using the rows and cols sets, update the elements
for i in range(R):
for j in range(C):
if i in rows or j in cols:
matrix[i][j] = 0
```
Optimization: space O(1), time O(R*C)
```python
class Solution(object):
def setZeroes(self, matrix):
"""
:type matrix: List[List[int]]
:rtype: void Do not return anything, modify matrix in-place instead.
"""
is_col = False
R = len(matrix)
C = len(matrix[0])
# Essentially, we mark the rows and columns that are to be made zero
for i in range(R):
if matrix[i][0]==0:
is_col = True
for j in range(1,C):
if matrix[i][j] == 0:
matrix[i][0] = 0
matrix[0][j] = 0
# Iterate over the array once again and using the rows and cols sets, update the elements
for i in range(1,R):
for j in range(1,C):
if matrix[i][0] == 0 or matrix[0][j] == 0:
matrix[i][j] = 0
if matrix[0][0] == 0:
for j in range(C):
matrix[0][j] = 0
if is_col:
for i in range(R):
matrix[i][0] = 0
```
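As a quick sanity check, here is a standalone sketch of the constant-space approach above. The function name `set_zeroes` and the sample matrix are mine, not part of the original problem code:

```python
def set_zeroes(matrix):
    """Zero out rows/columns in place, using the first row/column as markers."""
    rows, cols = len(matrix), len(matrix[0])
    first_col_has_zero = any(matrix[i][0] == 0 for i in range(rows))
    first_row_has_zero = any(matrix[0][j] == 0 for j in range(cols))

    # Record each zero by marking its row in column 0 and its column in row 0.
    for i in range(1, rows):
        for j in range(1, cols):
            if matrix[i][j] == 0:
                matrix[i][0] = 0
                matrix[0][j] = 0

    # Zero interior cells based on the markers.
    for i in range(1, rows):
        for j in range(1, cols):
            if matrix[i][0] == 0 or matrix[0][j] == 0:
                matrix[i][j] = 0

    # Finally handle the first row and column themselves.
    if first_row_has_zero:
        for j in range(cols):
            matrix[0][j] = 0
    if first_col_has_zero:
        for i in range(rows):
            matrix[i][0] = 0


m = [[1, 1, 1], [1, 0, 1], [1, 1, 1]]
set_zeroes(m)
print(m)  # → [[1, 0, 1], [0, 0, 0], [1, 0, 1]]
```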
] | null | null | null | ---
slug: decision-to-canonize-junipero-serra-draws-divided-reaction
title: "Decision to canonize Junipero Serra draws divided reaction"
date: January 01 2020
---
<p>
Others such as Ruben Mendoza, coordinator of California mission archaeology at
Cal State Monterey Bay, say the canonization is long overdue. "I've always
felt the canonization process was stymied through misinformation and
politicization, and laying blame and onus on one individual who was actually
in constant conflict with governors and military commanders in New Spain over
how they were treating Indians." – Seattle Times, Jan. 17, 2015
</p>
| 37.117647 | 80 | 0.7813 | eng_Latn | 0.998071 |
# Arg Resolvers
To understand the concept behind arg resolvers, you should familiarize yourself with
[how field resolvers are composed](https://graphql.org/learn/execution/).
## Motivation
Arg resolvers are an extension of the ideas behind GraphQL field execution,
applied to input arguments. Since GraphQL queries can be used to fetch complex
and deeply nested data from the client, it is natural to assume that such complex
data can also be passed as the input arguments to a query.
GraphQL's execution engine allows you to write small and focused field resolver functions
that each care only about returning the data they are immediately responsible for.
That makes the code much simpler and avoids duplication.
However, a single field resolver still has to take care of all the input arguments that
are passed to it. Handling complex input data in a single function is hard because of its
dynamic nature. The input given by a client might be nested arbitrarily deep
and come in many different variations.
The following example shows a mutation that is actually composed of multiple distinct operations.
```graphql
type Mutation {
createTask(input: CreateTaskInput): Task!
}
input CreateTaskInput {
name: String!
notes: [CreateNoteInput!]
}
input CreateNoteInput {
content: String!
link: String
}
```
In a single request, we can pass all data relating to a task,
including related entities such as notes.
```graphql
mutation CreateTaskWithNotes {
  createTask(
    input: {
      name: "Do something"
      notes: [
        { content: "Foo bar", link: "http://foo.bar" }
        { content: "Awesome note" }
      ]
    }
  ) {
    id
  }
}
```
We might resolve that mutation by writing a resolver function that handles all input at once.
```php
function createTaskWithNotes($root, array $args): \App\Models\Task {
// Pull and remove notes from the args array
$notes = \Illuminate\Support\Arr::pull($args, 'notes');
// Create the new task with the remaining args
$task = \App\Models\Task::create($args);
// If the client actually passed notes, create and attach them
if($notes) {
foreach($notes as $note) {
$task->notes()->create($note);
}
}
return $task;
}
```
In this contrived example, the function is still pretty simple. However, separation of concerns
is already violated: A single function is responsible for creating both tasks and notes.
We might want to extend our schema to support more operations in the future, such as updating
a task and creating, updating or deleting notes or other, more deeply nested relations.
Such changes would force us to duplicate code and increase the complexity of our single function.
## Solution
Ideally, we would want to write small and focused functions that each deal with just
a part of the given input arguments. The execution engine should traverse the given
input and take care of calling the appropriate functions with their respective arguments.
```php
function createTask($root, array $args): \App\Models\Task {
return \App\Models\Task::create($args);
}
function createTaskNotes(\App\Models\Task $root, array $args): void {
foreach($args as $note) {
$root->notes()->create($note);
}
}
```
Lighthouse allows you to attach resolver functions to arguments.
Complex inputs are automatically split into smaller pieces and passed off to the responsible function.
As Lighthouse uses the SDL as the primary building block, arg resolvers are implemented as directives.
Here is how we can define a schema that enables sending a nested mutation as in the example above.
```diff
type Mutation {
- createTask(input: CreateTaskInput): Task!
+ createTask(input: CreateTaskInput): Task! @create
}
input CreateTaskInput {
name: String!
- notes: [CreateNoteInput!]
+ notes: [CreateNoteInput!] @create
}
input CreateNoteInput {
content: String!
link: String
}
```
The `@create` directive will behave differently, based on the context where it is used.
On the `createTask` field, it will create a `Task` model with the given `name`, save it
to the database and return that instance to Lighthouse.
A simplified, generic implementation of an appropriate field resolver would look something like this:
```php
<?php
namespace Nuwave\Lighthouse\Schema\Directives;
use Illuminate\Database\Eloquent\Model;
use Nuwave\Lighthouse\Execution\Arguments\ResolveNested;
use Nuwave\Lighthouse\Schema\Values\FieldValue;
use Nuwave\Lighthouse\Support\Contracts\FieldResolver;
use Nuwave\Lighthouse\Support\Contracts\GraphQLContext;
class CreateDirective extends BaseDirective implements FieldResolver
{
public function resolveField(FieldValue $fieldValue)
{
return $fieldValue->setResolver(
function ($root, array $args, GraphQLContext $context, ResolveInfo $resolveInfo): Model {
// Wrap the operation and let Lighthouse take care of splitting the input
$nestedSave = new ResolveNested(function($model, $args) {
$model->fill($args->toArray());
$model->save();
});
$modelClass = $this->getModelClass();
/** @var \Illuminate\Database\Eloquent\Model $model */
$model = new $modelClass;
return $nestedSave($model, $resolveInfo->argumentSet);
}
);
}
}
```
The arguments that are nested within `notes` will be handled as a nested argument resolver.
For each `CreateNoteInput`, the resolver will be called with the previously created `Task`
and create and attach a related `Note` model.
We can extend our previous implementation of `@create` by allowing it to be used as an `ArgResolver`:
```php
<?php
namespace Nuwave\Lighthouse\Schema\Directives;
use Illuminate\Database\Eloquent\Model;
use Nuwave\Lighthouse\Execution\Arguments\ResolveNested;
use Nuwave\Lighthouse\Schema\Values\FieldValue;
use Nuwave\Lighthouse\Support\Contracts\ArgResolver;
use Nuwave\Lighthouse\Support\Contracts\FieldResolver;
use Nuwave\Lighthouse\Support\Contracts\GraphQLContext;
class CreateDirective extends BaseDirective implements FieldResolver, ArgResolver
{
public function resolveField(FieldValue $fieldValue) { ... }
/**
* @param \Illuminate\Database\Eloquent\Model $parent
* @param array<\Nuwave\Lighthouse\Execution\Arguments\ArgumentSet> $argsList
* @return array<\Illuminate\Database\Eloquent\Model>
*/
public function __invoke($parent, $argsList): array
{
$relationName = $this->getRelationName();
/** @var \Illuminate\Database\Eloquent\Relations\Relation $relation */
$relation = $parent->{$relationName}();
$related = $relation->make();
return array_map(
function ($args) use ($related) {
$related->fill($args->toArray());
$related->save();
},
$argsList
);
}
}
```
You may define your own nested arg resolver directives by implementing [`ArgResolver`](../custom-directives/argument-directives.md#argresolver).
| 32.788018 | 144 | 0.719185 | eng_Latn | 0.982819 |
# dmqb-grp2-2018
The project assignment work of group 2 in the lecture DMQB
| 25.333333 | 58 | 0.789474 | eng_Latn | 0.993297 |
---
title: Message Options Dialog Box | Microsoft Docs
ms.custom: ''
ms.date: 11/04/2016
ms.technology: vs-ide-debug
ms.topic: reference
helpviewer_keywords:
- message options, Spy++
- Spy++, message options
ms.assetid: 88ad85af-3f56-4df1-98b6-fab34c1e5874
author: mikejo5000
ms.author: mikejo
manager: douge
ms.workload:
- multiple
ms.openlocfilehash: 76ced120b9545adb21a4fcfe73df6f69961568b7
ms.sourcegitcommit: 3d10b93eb5b326639f3e5c19b9e6a8d1ba078de1
ms.translationtype: MT
ms.contentlocale: fr-FR
ms.lasthandoff: 04/18/2018
ms.locfileid: "31474919"
---
# <a name="message-options-dialog-box"></a>Message Options Dialog Box
Use this dialog box to select which messages are listed in [Messages View](../debugger/messages-view.md). To display this dialog box, choose **Log Messages** from the **Spy** menu.

## <a name="in-this-section"></a>In This Section
[Windows Tab, Message Options Dialog Box](../debugger/windows-tab-message-options-dialog-box.md)
Used to select which message types to list. Includes the Finder Tool.

[Messages Tab, Message Options Dialog Box](../debugger/messages-tab-message-options-dialog-box.md)
Used to select which message types to display.

[Output Tab, Message Options Dialog Box](../debugger/output-tab-message-options-dialog-box.md)
Used to specify which message data to display.

## <a name="related-sections"></a>Related Sections
[Message Search Dialog Box](../debugger/message-search-dialog-box.md)
Used to find the node for a specific message in Messages view.

[Message Properties Dialog Box](../debugger/message-properties-dialog-box.md)
Used to display the properties of a message selected in Messages view.

[Spy++ Reference](../debugger/spy-increment-reference.md)
Includes sections describing each Spy++ menu and dialog box.

[Searching for a Message in Messages View](../debugger/how-to-search-for-a-message-in-messages-view.md)
Explains how to find a specific message in Messages view.

[Opening Messages View from the Find Window](../debugger/how-to-open-messages-view-from-find-window.md)
Explains how to open Messages view from the Find Window dialog box.

[Messages View](../debugger/messages-view.md)
Displays the message stream associated with a window, process, or thread.

[Spy++ Views](../debugger/spy-increment-views.md)
Explains the Spy++ tree views of windows, messages, processes, and threads.

[Using Spy++](../debugger/using-spy-increment.md)
Introduces the Spy++ tool and explains how it can be used.
---
title: Azure Event Hubs Firewall Rules | Microsoft Docs
description: Use Firewall Rules to allow connections from specific IP addresses to Azure Event Hubs.
ms.topic: article
ms.date: 07/16/2020
---
# Configure IP firewall rules for an Azure Event Hubs namespace
By default, Event Hubs namespaces are accessible from internet as long as the request comes with valid authentication and authorization. With IP firewall, you can restrict it further to only a set of IPv4 addresses or IPv4 address ranges in [CIDR (Classless Inter-Domain Routing)](https://en.wikipedia.org/wiki/Classless_Inter-Domain_Routing) notation.
This feature is helpful in scenarios in which Azure Event Hubs should be only accessible from certain well-known sites. Firewall rules enable you to configure rules to accept traffic originating from specific IPv4 addresses. For example, if you use Event Hubs with [Azure Express Route][express-route], you can create a **firewall rule** to allow traffic from only your on-premises infrastructure IP addresses.
>[!WARNING]
> Enabling IP filtering can prevent other Azure services from interacting with Event Hubs.
>
> Trusted Microsoft services are not supported when Virtual Networks are implemented.
>
> Common Azure scenarios that don't work with Virtual Networks (note that the list is **NOT** exhaustive):
> - Azure Stream Analytics
> - Integration with Azure Event Grid
> - Azure IoT Hub Routes
> - Azure IoT Device Explorer
>
> The following Microsoft services are required to be on a virtual network
> - Azure Web Apps
> - Azure Functions
## IP firewall rules
The IP firewall rules are applied at the Event Hubs namespace level. Therefore, the rules apply to all connections from clients using any supported protocol. Any connection attempt from an IP address that does not match an allowed IP rule on the Event Hubs namespace is rejected as unauthorized. The response does not mention the IP rule. IP filter rules are applied in order, and the first rule that matches the IP address determines the accept or reject action.
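The first-match semantics can be sketched in a few lines of Python — an illustration only, the actual evaluation happens inside the Event Hubs service:

```python
import ipaddress

def evaluate(ip, rules, default_action="Deny"):
    """Return the action of the first rule whose CIDR range contains `ip`.

    Mirrors the documented behavior: rules are applied in order, the first
    match wins, and if nothing matches the default action applies.
    """
    addr = ipaddress.ip_address(ip)
    for cidr, action in rules:
        if addr in ipaddress.ip_network(cidr):
            return action
    return default_action

# Hypothetical rule list in evaluation order.
rules = [("70.37.104.0/24", "Allow"), ("10.1.1.1/32", "Allow")]
print(evaluate("70.37.104.17", rules))  # Allow
print(evaluate("8.8.8.8", rules))       # Deny (default action)
```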
## Use Azure portal
This section shows you how to use the Azure portal to create IP firewall rules for an Event Hubs namespace.
1. Navigate to your **Event Hubs namespace** in the [Azure portal](https://portal.azure.com).
2. On the left menu, select **Networking** option. If you select the **All networks** option, the event hub accepts connections from any IP address. This setting is equivalent to a rule that accepts the 0.0.0.0/0 IP address range.

1. To restrict access to specific networks and IP addresses, select the **Selected networks** option. In the **Firewall** section, follow these steps:
1. Select **Add your client IP address** option to give your current client IP the access to the namespace.
2. For **address range**, enter a specific IPv4 address or a range of IPv4 address in CIDR notation.
3. Specify whether you want to **allow trusted Microsoft services to bypass this firewall**.
> [!WARNING]
> If you choose the **Selected networks** option and don't specify an IP address or address range, the service will allow traffic from all networks.

3. Select **Save** on the toolbar to save the settings. Wait for a few minutes for the confirmation to show up on the portal notifications.
## Use Resource Manager template
> [!IMPORTANT]
> Firewall rules are supported in **standard** and **dedicated** tiers of Event Hubs. It's not supported in basic tier.
The following Resource Manager template enables adding an IP filter rule to an existing Event Hubs namespace.
Template parameters:
- **ipMask** is a single IPv4 address or a block of IP addresses in CIDR notation. For example, in CIDR notation 70.37.104.0/24 represents the 256 IPv4 addresses from 70.37.104.0 to 70.37.104.255, with 24 indicating the number of significant prefix bits for the range.
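The arithmetic for a CIDR block can be verified with Python's standard `ipaddress` module (illustration only):

```python
import ipaddress

# The /24 block from the example above covers 256 IPv4 addresses.
block = ipaddress.ip_network("70.37.104.0/24")
print(block.num_addresses)       # 256
print(block.network_address)     # 70.37.104.0
print(block.broadcast_address)   # 70.37.104.255
```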
> [!NOTE]
> While there are no deny rules possible, the Azure Resource Manager template has the default action set to **"Allow"** which doesn't restrict connections.
> When making Virtual Network or Firewalls rules, we must change the
> ***"defaultAction"***
>
> from
> ```json
> "defaultAction": "Allow"
> ```
> to
> ```json
> "defaultAction": "Deny"
> ```
>
```json
{
"$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
"contentVersion": "1.0.0.0",
"parameters": {
"eventhubNamespaceName": {
"type": "string",
"metadata": {
"description": "Name of the Event Hubs namespace"
}
},
"location": {
"type": "string",
"metadata": {
"description": "Location for Namespace"
}
}
},
"variables": {
"namespaceNetworkRuleSetName": "[concat(parameters('eventhubNamespaceName'), concat('/', 'default'))]",
},
"resources": [
{
"apiVersion": "2018-01-01-preview",
"name": "[parameters('eventhubNamespaceName')]",
"type": "Microsoft.EventHub/namespaces",
"location": "[parameters('location')]",
"sku": {
"name": "Standard",
"tier": "Standard"
},
"properties": { }
},
{
"apiVersion": "2018-01-01-preview",
"name": "[variables('namespaceNetworkRuleSetName')]",
"type": "Microsoft.EventHub/namespaces/networkruleset",
"dependsOn": [
"[concat('Microsoft.EventHub/namespaces/', parameters('eventhubNamespaceName'))]"
],
"properties": {
"virtualNetworkRules": [<YOUR EXISTING VIRTUAL NETWORK RULES>],
"ipRules":
[
{
"ipMask":"10.1.1.1",
"action":"Allow"
},
{
"ipMask":"11.0.0.0/24",
"action":"Allow"
}
],
"trustedServiceAccessEnabled": false,
"defaultAction": "Deny"
}
}
],
"outputs": { }
}
```
To deploy the template, follow the instructions for [Azure Resource Manager][lnk-deploy].
## Next steps
For constraining access to Event Hubs to Azure virtual networks, see the following link:
- [Virtual Network Service Endpoints for Event Hubs][lnk-vnet]
<!-- Links -->
[express-route]: /azure/expressroute/expressroute-faqs#supported-services
[lnk-deploy]: ../azure-resource-manager/templates/deploy-powershell.md
[lnk-vnet]: event-hubs-service-endpoints.md
| 43.953947 | 463 | 0.69286 | eng_Latn | 0.970525 |
---
title: moreLikeThis in Azure Search (preview) | Microsoft Docs
description: Preview documentation for the moreLikeThis feature, exposed in the Azure Search REST API.
authors: mhko
manager: jlembicz
services: search
ms.service: search
ms.devlang: rest-api
ms.topic: conceptual
ms.date: 10/27/2016
ms.author: nateko
ms.openlocfilehash: 29d9a478ca2e91e658d7d0f52e7a193ba694bc16
ms.sourcegitcommit: fa493b66552af11260db48d89e3ddfcdcb5e3152
ms.translationtype: HT
ms.contentlocale: es-ES
ms.lasthandoff: 04/23/2018
ms.locfileid: "31790736"
---
# <a name="morelikethis-in-azure-search-preview"></a>moreLikeThis in Azure Search (preview)
`moreLikeThis=[key]` is a query parameter in the [Search API](https://docs.microsoft.com/rest/api/searchservice/search-documents). Specifying the `moreLikeThis` parameter in a search query finds documents that are similar to the document specified by the document key. When a search request is made with `moreLikeThis`, a query is generated with search terms extracted from the given document that best describe that document. The generated query is then used to make the search request. By default, the contents of all `searchable` fields are considered unless the `searchFields` parameter is used to restrict the fields. The `moreLikeThis` parameter cannot be used together with the search parameter, `search=[string]`.

## <a name="examples"></a>Examples
The following is an example of a moreLikeThis query. The query finds documents whose description fields are most similar to the corresponding field of the source document, as specified by the `moreLikeThis` parameter.
```
Get /indexes/hotels/docs?moreLikeThis=1002&searchFields=description&api-version=2016-09-01-Preview
```
```
POST /indexes/hotels/docs/search?api-version=2016-09-01-Preview
{
"moreLikeThis": "1002",
"searchFields": "description"
}
```
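The same POST request can also be assembled programmatically. The sketch below (the service URL and index name are placeholders, and no network call is made) just builds the URL and JSON body:

```python
import json

SERVICE = "https://myservice.search.windows.net"  # placeholder service URL
API_VERSION = "2016-09-01-Preview"

def more_like_this_request(index, key, search_fields):
    """Assemble the URL and JSON body for a moreLikeThis POST request."""
    url = f"{SERVICE}/indexes/{index}/docs/search?api-version={API_VERSION}"
    body = {"moreLikeThis": key, "searchFields": search_fields}
    return url, json.dumps(body)

url, body = more_like_this_request("hotels", "1002", "description")
print(url)
print(body)
```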
## <a name="feature-availability"></a>Feature availability
The moreLikeThis feature is currently in preview and is supported only in the preview API versions `2015-02-28-Preview` and `2016-09-01-Preview`. Because the API version is specified per request, it is possible to combine preview and generally available APIs in the same application. However, preview APIs are not covered by the service level agreement and their features may change, so their use in production applications is not recommended.
---
layout: moneropedia
title: titles.moneropedia
entry: moneropedia.entries.reseed
---
@moneropedia_article
{% t global.lang_tag %}
{% tf resources/moneropedia/reseed.md %}
| 16 | 40 | 0.75 | eng_Latn | 0.198946 |
---
title: Send push notifications with Azure Notification Hubs and Bing Spatial Data | Microsoft Docs
description: In this tutorial, learn how to deliver location-based push notifications with Azure Notification Hubs and Bing Spatial Data.
services: notification-hubs
documentationcenter: windows
keywords: push notification, push notifications
author: sethmanheim
manager: femila
editor: jwargo
ms.assetid: f41beea1-0d62-4418-9ffc-c9d70607a1b7
ms.service: notification-hubs
ms.workload: mobile
ms.tgt_pltfrm: mobile-windows-phone
ms.devlang: dotnet
ms.topic: tutorial
ms.custom: mvc
ms.date: 01/04/2019
ms.author: sethm
ms.reviewer: jowargo
ms.lastreviewed: 01/04/2019
ms.openlocfilehash: ff37a3ecb55c6ee034d3fd2558909c3b4ef1d375
ms.sourcegitcommit: f844603f2f7900a64291c2253f79b6d65fcbbb0c
ms.translationtype: HT
ms.contentlocale: de-DE
ms.lasthandoff: 07/10/2020
ms.locfileid: "86223430"
---
# <a name="tutorial-send-location-based-push-notifications-with-notification-hubs-and-bing-spatial-data"></a>Tutorial: Send location-based push notifications with Notification Hubs and Bing Spatial Data

In this tutorial, you learn how to deliver location-based push notifications with Azure Notification Hubs and Bing Spatial Data.

In this tutorial, you take the following steps:

> [!div class="checklist"]
> * Set up the data source
> * Set up the UWP application
> * Set up the backend
> * Test push notifications in the Universal Windows Platform (UWP) app

## <a name="prerequisites"></a>Prerequisites

* **Azure subscription**. If you don't have an Azure subscription, [create a free Azure account](https://azure.microsoft.com/free/) before you begin.
* [Visual Studio 2015 Update 1](https://www.visualstudio.com/downloads/download-visual-studio-vs.aspx) or later ([Community Edition](https://go.microsoft.com/fwlink/?LinkId=691978&clcid=0x409)).
* Latest version of the [Azure SDK](https://azure.microsoft.com/downloads/)
* [Bing Maps Dev Center account](https://www.bingmapsportal.com/) (you can create an account for free and associate it with your Microsoft account)

## <a name="set-up-the-data-source"></a>Set up the data source

1. Log in to the [Bing Maps Dev Center](https://www.bingmapsportal.com/).
2. In the top navigation bar, select **Data sources**, and then select **Manage Data Sources**.

    ![Manage Data Sources option in the Bing Maps Dev Center](./media/notification-hubs-geofence/bing-maps-manage-data.png)
3. If you don't have an existing data source, you see a link to create one. Select **Upload data as a data source**. You can also use the **Data sources** > **Upload data** menu.

    ![Upload data as a data source option](./media/notification-hubs-geofence/bing-maps-create-data-source.png)
4. On your hard drive, create a file named `NotificationHubsGeofence.pipe` with the following content. In this tutorial, you use a sample pipe-based file that frames an area of the San Francisco waterfront:
```text
Bing Spatial Data Services, 1.0, TestBoundaries
EntityID(Edm.String,primaryKey)|Name(Edm.String)|Longitude(Edm.Double)|Latitude(Edm.Double)|Boundary(Edm.Geography)
1|SanFranciscoPier|||POLYGON ((-122.389825 37.776598,-122.389438 37.773087,-122.381885 37.771849,-122.382186 37.777022,-122.389825 37.776598))
```
    The pipe file represents this entity:

    ![Map of the sample geofence on the San Francisco waterfront](./media/notification-hubs-geofence/bing-maps-geofence.png)
5. On the **Upload a data source** page, do the following actions:
    1. Select **Pipe** as the **Data format**.
    2. Browse to and select the `NotificationHubGeofence.pipe` file that you created in the previous step.
    3. Select the **Upload** button.

   > [!NOTE]
   > You might be prompted to specify a new key for the **Master Key** that is different from the **Query Key**. Simply create a new key in the dashboard and refresh the data source upload page.
6. After you upload the data file, make sure that you publish the data source. Select **Data sources** -> **Manage Data Sources** as before.
7. Select your data source in the list, and choose **Publish** in the **Actions** column.

    ![Publish action in the Actions column](./media/notification-hubs-geofence/bing-maps-publish-button.png)
8. Switch to the **Published Data Sources** tab and confirm that your data source appears in the list.

    ![Published Data Sources tab](./media/notification-hubs-geofence/bing-maps-published-data-sources.png)
9. Select **Edit**. You can see at a glance which locations you included in the data.

    ![Edit view of the uploaded data source](./media/notification-hubs-geofence/bing-maps-data-source-edit.png)

    At this point, you don't yet see the boundaries of the geofence that you created; all you need is a confirmation that the specified location is in the right vicinity.
10. You now have all the requirements for the data source. To get the details of the request URL for the API call, in the Bing Maps Dev Center, choose **Data sources**, and then select **Data Source Information**.

    ![Data Source Information page](./media/notification-hubs-geofence/bing-maps-query-url.png)

    The **Query URL** is the endpoint against which you can run queries to check whether the device is currently within the boundaries of a location. To perform this check, simply run a GET call against the query URL, with the following parameters appended:
```text
    ?spatialFilter=intersects(%27POINT%20(LONGITUDE%20LATITUDE)%27)&$format=json&key=QUERY_KEY
```
    Bing Maps automatically performs the calculations needed to determine whether the device is inside the geofence. After you execute the request through a browser (or cURL), you get a standard JSON response:

    ![JSON response containing the matched entity](./media/notification-hubs-geofence/bing-maps-json.png)

    This response only happens when the point is actually within the designated boundaries. If it is not, you get an empty **results** bucket:

    ![JSON response with an empty results bucket](./media/notification-hubs-geofence/bing-maps-nores.png)
## <a name="set-up-the-uwp-application"></a>Set up the UWP application

1. In Visual Studio, start a new project of type **Blank App (Universal Windows)**.

    ![Blank App (Universal Windows) project template](./media/notification-hubs-geofence/notification-hubs-create-blank-app.png)

    After the project creation completes, you should have the skeleton of the app. Now let's set up the geofencing infrastructure. Because you use Bing services for this solution, there is a public REST API endpoint that lets you query specific location frames:

    ```text
    http://spatial.virtualearth.net/REST/v1/data/
    ```

    Specify the following parameters:

    * **Data source ID** and **Data source name**: In the Bing Maps API, data sources contain bucketed metadata, such as locations and business hours of operation.
    * **Entity name**: Specifies the entity that you want to use as a reference point for the notification.
    * **Bing Maps API key**: This is the key that you obtained when you created the Bing Dev Center account.

    Now that the data source is ready, you can start work on the UWP application.
2. Enable location services for your application. In **Solution Explorer**, open the `Package.appxmanifest` file.

    ![Package.appxmanifest in Solution Explorer](./media/notification-hubs-geofence/notification-hubs-package-manifest.png)
3. On the package properties tab that opens, switch to **Capabilities** and select **Location**.

    ![Location capability on the Capabilities tab](./media/notification-hubs-geofence/notification-hubs-package-location.png)
4. In your solution, create a new folder named `Core`, and add a new file to it named `LocationHelper.cs`:

    ![LocationHelper.cs in the Core folder](./media/notification-hubs-geofence/notification-hubs-location-helper.png)

    The `LocationHelper` class contains code to obtain the user's location through the system API:
```csharp
using System;
using System.Threading.Tasks;
using Windows.Devices.Geolocation;
namespace NotificationHubs.Geofence.Core
{
public class LocationHelper
{
private static readonly uint AppDesiredAccuracyInMeters = 10;
public async static Task<Geoposition> GetCurrentLocation()
{
var accessStatus = await Geolocator.RequestAccessAsync();
switch (accessStatus)
{
case GeolocationAccessStatus.Allowed:
{
Geolocator geolocator = new Geolocator { DesiredAccuracyInMeters = AppDesiredAccuracyInMeters };
return await geolocator.GetGeopositionAsync();
}
default:
{
return null;
}
}
}
}
}
```
    For more information about getting the user's location in UWP apps, see [Get the user's location](https://msdn.microsoft.com/library/windows/apps/mt219698.aspx).
5. To verify that location acquisition works, open the code-behind of the main page (`MainPage.xaml.cs`). Create a new event handler for the `Loaded` event in the `MainPage` constructor.
```csharp
public MainPage()
{
this.InitializeComponent();
this.Loaded += MainPage_Loaded;
}
```
    The implementation of the event handler is as follows:
```csharp
private async void MainPage_Loaded(object sender, RoutedEventArgs e)
{
var location = await LocationHelper.GetCurrentLocation();
if (location != null)
{
Debug.WriteLine(string.Concat(location.Coordinate.Longitude,
" ", location.Coordinate.Latitude));
}
}
```
6. Run the application, and grant it access to your location.
    
7. After the application starts, you should see the coordinates in the **Output** window:
    
    Now you know that location acquisition works. If you want, you can remove the test `Loaded` event handler, because it is no longer needed.
8. The next step is to capture location changes. In the `LocationHelper` class, add the event handler for `PositionChanged`:
```csharp
geolocator.PositionChanged += Geolocator_PositionChanged;
```
The implementation prints the location coordinates to the **Output** window:
```csharp
private static async void Geolocator_PositionChanged(Geolocator sender, PositionChangedEventArgs args)
{
await CoreApplication.MainView.CoreWindow.Dispatcher.RunAsync(CoreDispatcherPriority.Normal, () =>
{
Debug.WriteLine(string.Concat(args.Position.Coordinate.Longitude, " ", args.Position.Coordinate.Latitude));
});
}
```
## <a name="set-up-the-backend"></a>Set up the backend
1. Download the [.NET backend sample from GitHub](https://github.com/Azure/azure-notificationhubs-dotnet/tree/master/Samples/NotifyUsers).
2. After the download completes, open the `NotifyUsers` folder, and then open the `NotifyUsers.sln` file in Visual Studio.
3. Set the `AppBackend` project as the startup project by using the **Set as StartUp Project** option, and start it.
    
    The project is already configured to send push notifications to target devices, so you only need to do two things: specify the correct connection string for the notification hub, and add boundary identification so that the notification is sent only when the user is inside the geofence.
4. To configure the connection string, open `Notifications.cs` in the `Models` folder. The `NotificationHubClient.CreateClientFromConnectionString` function should contain the information about your notification hub, which you can get in the [Azure portal](https://portal.azure.com) (on the **Access Policies** page under **Settings**). Save the updated configuration file.
5. Create a model for the Bing Maps API result. The easiest way is to open the `Models` folder and select **Add** > **Class**. Name it `GeofenceBoundary.cs`. Then copy the JSON from the API response that you got in the first section, and in Visual Studio use **Edit** > **Paste Special** > **Paste JSON as Classes**.
    This way you make sure that the object is deserialized exactly as intended. The resulting set of classes should look like the following:
```csharp
namespace AppBackend.Models
{
public class Rootobject
{
public D d { get; set; }
}
public class D
{
public string __copyright { get; set; }
public Result[] results { get; set; }
}
public class Result
{
public __Metadata __metadata { get; set; }
public string EntityID { get; set; }
public string Name { get; set; }
public float Longitude { get; set; }
public float Latitude { get; set; }
public string Boundary { get; set; }
public string Confidence { get; set; }
public string Locality { get; set; }
public string AddressLine { get; set; }
public string AdminDistrict { get; set; }
public string CountryRegion { get; set; }
public string PostalCode { get; set; }
}
public class __Metadata
{
public string uri { get; set; }
}
}
```
6. Now open `Controllers` > `NotificationsController.cs`. Update the POST call to include the target's longitude and latitude. To do this, add two strings to the function signature: `latitude` and `longitude`.
```csharp
public async Task<HttpResponseMessage> Post(string pns, [FromBody]string message, string to_tag, string latitude, string longitude)
```
7. Create a new class in the project named `ApiHelper.cs`. You use this class to connect to Bing to check boundary intersections. Implement an `IsPointWithinBounds` function, as shown in the following code:
```csharp
public class ApiHelper
{
public static readonly string ApiEndpoint = "{YOUR_QUERY_ENDPOINT}?spatialFilter=intersects(%27POINT%20({0}%20{1})%27)&$format=json&key={2}";
public static readonly string ApiKey = "{YOUR_API_KEY}";
public static bool IsPointWithinBounds(string longitude,string latitude)
{
var json = new WebClient().DownloadString(string.Format(ApiEndpoint, longitude, latitude, ApiKey));
var result = JsonConvert.DeserializeObject<Rootobject>(json);
if (result.d.results != null && result.d.results.Count() > 0)
{
return true;
}
else
{
return false;
}
}
}
```
> [!IMPORTANT]
> Be sure to replace the API endpoint with the query URL that you obtained earlier from the Bing Dev Center (the same applies to the API key).
If the query returns results, the given point lies inside the boundaries of the geofence, so the function returns `true`. If there are no results, Bing is telling you that the point is outside the search area, so the function returns `false`.
8. In `NotificationsController.cs`, add a check right before the switch statement:
```csharp
if (ApiHelper.IsPointWithinBounds(longitude, latitude))
{
switch (pns.ToLower())
{
case "wns":
//// Windows 8.1 / Windows Phone 8.1
var toast = @"<toast><visual><binding template=""ToastText01""><text id=""1"">" +
"From " + user + ": " + message + "</text></binding></visual></toast>";
outcome = await Notifications.Instance.Hub.SendWindowsNativeNotificationAsync(toast, userTag);
// Windows 10 specific Action Center support
toast = @"<toast><visual><binding template=""ToastGeneric""><text id=""1"">" +
"From " + user + ": " + message + "</text></binding></visual></toast>";
outcome = await Notifications.Instance.Hub.SendWindowsNativeNotificationAsync(toast, userTag);
break;
}
}
```
## <a name="test-push-notifications-in-the-uwp-app"></a>Test push notifications in the UWP app
1. You should now be able to test notifications in the UWP app. In the `LocationHelper` class, create a new function, `SendLocationToBackend`:
```csharp
public static async Task SendLocationToBackend(string pns, string userTag, string message, string latitude, string longitude)
{
var POST_URL = "http://localhost:8741/api/notifications?pns=" +
pns + "&to_tag=" + userTag + "&latitude=" + latitude + "&longitude=" + longitude;
using (var httpClient = new HttpClient())
{
try
{
await httpClient.PostAsync(POST_URL, new StringContent("\"" + message + "\"",
System.Text.Encoding.UTF8, "application/json"));
}
catch (Exception ex)
{
Debug.WriteLine(ex.Message);
}
}
}
```
> [!NOTE]
> Set the `POST_URL` to the location of your deployed web application. For the purposes of this tutorial, running it locally is fine. However, if you want to deploy a public version, you need to host it with an external provider.
2. Register the UWP app for push notifications. In Visual Studio, select **Project** > **Store** > **Associate App with the Store**.
    
3. After you sign in to your developer account, be sure to select an existing app, or create a new one and associate the package with it.
4. Go to the Dev Center, and open the app that you created. Select **Services** > **Push notifications** > **Live Services site**.
    
5. On the site, take note of the **Application Secret** and the **Package SID**. You need both in the Azure portal: open your notification hub, select **Settings** > **Notification Services** > **Windows (WNS)**, and enter the information in the required fields.
    
6. Select **Save**.
7. In **Solution Explorer**, open **References**, and select **Manage NuGet Packages**. Add a reference to the **Microsoft Azure Service Bus managed library**: just search for `WindowsAzure.Messaging.Managed`, and add it to the project.
    
8. For testing purposes, create the `MainPage_Loaded` event handler again, and add this code snippet to it:
```csharp
var channel = await PushNotificationChannelManager.CreatePushNotificationChannelForApplicationAsync();
var hub = new NotificationHub("HUB_NAME", "HUB_LISTEN_CONNECTION_STRING");
var result = await hub.RegisterNativeAsync(channel.Uri);
// Displays the registration ID so you know it was successful
if (result.RegistrationId != null)
{
Debug.WriteLine("Reg successful.");
}
```
The code registers the app with the notification hub. You're done!
9. In the `LocationHelper` class, in the `Geolocator_PositionChanged` handler, you can add test code that forces the location to be inside the geofence:
```csharp
await LocationHelper.SendLocationToBackend("wns", "TEST_USER", "TEST", "37.7746", "-122.3858");
```
10. Because you are not passing the real coordinates (which might not be within the boundaries right now) but predefined test values, you see a notification appear on update:
    
## <a name="next-steps"></a>Next steps
There are a few steps that you might need to complete to make the solution ready for production.
1. First, you should make sure that the geofences are dynamic. This requires some extra work with the Bing API to be able to upload new boundaries within the existing data source. For more information, see the [Bing Spatial Data Services API documentation](https://msdn.microsoft.com/library/ff701734.aspx).
2. Also, to make sure that delivery happens for the right subscribers, you might want to use [tagging](notification-hubs-tags-segment-push-message.md).
The solution presented in this tutorial describes a scenario in which you can target many different platforms, so geofencing is not limited to system-specific capabilities. That said, the Universal Windows Platform natively offers capabilities for [detecting geofences](https://msdn.microsoft.com/windows/uwp/maps-and-location/set-up-a-geofence).
# Vue
## [Environment Variables](https://cli.vuejs.org/zh/guide/mode-and-env.html)
```sh
.env                # loaded in all environments
.env.production     # loaded only in production mode
```
::: tip
`NODE_ENV` - will be one of "development", "production", or "test", depending on the mode the app is running in.
`BASE_URL` - matches the `publicPath` option in `vue.config.js`, i.e. the base path your app is deployed at.
:::
::: tip
Only variables that start with `VUE_APP_` are statically embedded into the client-side bundle by `webpack.DefinePlugin`,
e.g. `VUE_APP_SECRET='SECRET'`.
Restart the dev server for changes to take effect!
:::
- Pass the `--mode` option on the command line to override the default mode.
```json
"dev-build": "vue-cli-service build --mode development",
```
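Once embedded, such variables can be read anywhere in client code through `process.env`. A minimal sketch, assuming the `VUE_APP_SECRET` variable from the tip above (the `'not-set'` fallback is only for illustration; in a real Vue CLI build the value is substituted at compile time):

```javascript
// webpack.DefinePlugin statically replaces process.env.VUE_APP_SECRET with its
// literal value at build time; only VUE_APP_*-prefixed variables are exposed.
const secret = process.env.VUE_APP_SECRET || 'not-set';
const baseUrl = process.env.BASE_URL || '/';

console.log(secret, baseUrl);
```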
## Deployment

- Deploy multiple Vue web apps side by side, serving the docs app under the home app.

  Home (default configuration) router:

  ```js
  base: process.env.BASE_URL,
  mode: 'history',
  ```

  Docs router:

  ```js
  base: process.env.BASE_URL,
  mode: 'history',
  ```

  Docs `vue.config.js`:

  ```js
  publicPath: '/docs/',
  ```

  Nginx:

  ```
  server {
    listen 80;
    server_name localhost;
    root /usr/share/nginx/html;
    index index.html index.htm;
    location / {
      try_files $uri $uri/ /index.html;
    }
    location /docs {
      try_files $uri $uri/ /docs/index.html;
    }
    ...
  }
  ```
::: tip
<http://localhost> -> home
<http://localhost/docs> -> docs
:::
## Prism Code Highlighting
1. Configuration
```sh
yarn add prismjs
yarn add --dev babel-plugin-prismjs
```
```js
// babel.config.js: add babel-plugin-prismjs
module.exports = {
presets: ["@vue/app"],
plugins: [
["prismjs", {
"languages": ["javascript", "css", "html"],
"plugins": ["line-numbers", "show-language"],
"theme": "tomorrow", //node_modules/prismjs/themes/prism-*.css
"css": true
}]
]
};
```
2. Example
```html
<template>
<pre class="line-numbers">
<code class='language-js'>
{{code}}
</code>
</pre>
</template>
<script>
import Prism from 'prismjs';
export default {
name: 'Code',
data() {
return {
code:
`const bar=1
const foo='123'`,
};
},
mounted() {
this.$nextTick(() => {
Prism.highlightAll();
});
},
};
</script>
```
## Custom Directives
### drag
```js
Vue.directive('drag', {
  // called once, when the directive is first bound to the element
bind(el) {
// el.style.position = "absolute"
el.onmousedown = function (e) {
let disX = e.clientX - el.offsetLeft;
let disY = e.clientY - el.offsetTop;
document.onmousemove = function (e) {
el.style.left = e.clientX - disX + "px";
el.style.top = e.clientY - disY + "px";
};
document.onmouseup = function () {
document.onmousemove = null;
};
return false;
}
}
});
```
### rotate
```js
Vue.directive('rotate', {
bind(el) {
el.ondblclick = () => {
let deg = Number(el.dataset.deg) || 0;
deg += 90;
el.dataset.deg = deg
el.style.transform = `rotate(${deg}deg)`;
}
}
});
```
## Plugins
### [vue-cli-plugin-webpack-bundle-analyzer](https://www.npmjs.com/package/vue-cli-plugin-webpack-bundle-analyzer)
- Install
```sh
vue add webpack-bundle-analyzer
```
- [Configuration](https://github.com/webpack-contrib/webpack-bundle-analyzer#options-for-plugin)
```js
// vue.config.js
module.exports = {
pluginOptions: {
webpackBundleAnalyzer: {
openAnalyzer: process.env.NODE_ENV === 'production',
}
}
};
```
# Parameters
## Concept
You can use `strapi-utils` to parse query params into Strapi's standard filters programmatically if you need to.
## Extracting request filters
To transform a request's query params into Strapi's standard filters, you can use the `convertRestQueryParams` function from [strapi-utils](../global-strapi/api-reference.md#strapiutils).
```js
const { convertRestQueryParams } = require('strapi-utils');
module.exports = {
  // when sending a request like GET /products?_sort=id:desc&id=1
fetchExpensiveProducts: (params, populate) => {
const filters = convertRestQueryParams(params);
/**
* filters = {
* start: 0,
* limit: 10,
* sort: [{ field: 'id', order: 'desc' }],
* where: [
* { field: 'id', operator: 'eq', value: 1 },
* ]
* }
*/
// do sth with them
},
};
```
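To make the mapping concrete, here is a deliberately simplified, self-contained sketch of such a conversion. It only handles `_start`, `_limit`, `_sort`, and plain equality filters; the real `convertRestQueryParams` in `strapi-utils` supports many more operators (`_gt`, `_contains`, and so on):

```javascript
// Simplified illustration only — not the actual strapi-utils implementation.
function convertRestQueryParamsSketch(params) {
  const filters = { start: 0, limit: 10, sort: [], where: [] };
  for (const [key, value] of Object.entries(params)) {
    if (key === '_start') {
      filters.start = Number(value);
    } else if (key === '_limit') {
      filters.limit = Number(value);
    } else if (key === '_sort') {
      // "_sort=id:desc" -> { field: 'id', order: 'desc' } (order defaults to asc)
      const [field, order = 'asc'] = String(value).split(':');
      filters.sort.push({ field, order });
    } else {
      // Any other key is treated as a plain equality constraint.
      filters.where.push({ field: key, operator: 'eq', value: Number(value) || value });
    }
  }
  return filters;
}

console.log(convertRestQueryParamsSketch({ _sort: 'id:desc', id: '1' }));
// -> { start: 0, limit: 10, sort: [{ field: 'id', order: 'desc' }],
//      where: [{ field: 'id', operator: 'eq', value: 1 }] }
```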
## Querying data
We added a new API to query data based on the new filters API.
```js
const { convertRestQueryParams, buildQuery } = require('strapi-utils');
module.exports = {
  find: async ctx => {
    // Convert the request's query params to Strapi's standard filters.
    const filters = convertRestQueryParams(ctx.request.query);
    return buildQuery({
      model: Article,
      filters,
      populate: [],
    });
  },
};
```
### SQL databases (bookshelf)
If you are using a SQL database, calling `buildQuery` returns a [`Bookshelf query`](https://bookshelfjs.org/api.html) on which you can call other functions (e.g. `count`).
### Mongo database
If you are using a MongoDB database, calling `buildQuery` returns either a [`Mongoose Query`](https://mongoosejs.com/docs/api.html#Query) or a custom query when used with deep filtering.
#### Custom Query
When using the deep filtering feature with MongoDB, we build an aggregation query to avoid too many round trips to the database.
Doing that means we don't get Mongoose objects back but plain JS objects instead. This brings some issues, such as virtual fields and Mongoose lifecycle hooks not being available.
To deliver the best possible experience, we decided to rehydrate the Mongoose models, which forced us to override the Mongoose query:
```js
const query = buildQuery({
model: Product, // you can use any models from strapi.models or strapi.plugins[pluginName].models
filters: { limit: 10 },
populate: [],
});
```
returns a query with the following functions:
- `count` => Returns an integer equal to the number of matching entities.
- `lean` => Returns the matching elements as plain objects.
- `then(onSuccess, onFailure)` => Calls `onSuccess` with an array of Mongoose objects.
- `catch(onError)` => Promise catch.
- `group(options)` => Calls the Mongoose aggregation `group` function.
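A minimal sketch of how a wrapper like this can expose `count`/`lean`/`then`/`catch` on top of a single async fetch (illustration only; Strapi's real implementation runs a Mongo aggregation and rehydrates Mongoose models):

```javascript
// Illustration only — not Strapi's actual code. fetchDocs stands in for the
// async aggregation that produces plain JS objects.
function makeCustomQuery(fetchDocs) {
  const run = () => Promise.resolve().then(fetchDocs);
  return {
    count: () => run().then(docs => docs.length),
    lean: () => run(), // plain objects, no rehydration in this sketch
    then: (onSuccess, onFailure) => run().then(onSuccess, onFailure),
    catch: onError => run().catch(onError),
  };
}

const query = makeCustomQuery(() => [{ id: 1 }, { id: 2 }]);
query.count().then(n => console.log(n)); // prints 2
```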