Dataset Viewer (First 5GB)
id (string, 4 to 10 chars) | text (string, 4 to 2.14M chars) | source (string, 2 classes) | created (timestamp[us], 2001-05-16 21:05:09 to 2025-01-01 03:38:30) | added (string date, 2025-04-01 04:05:38 to 2025-04-01 07:14:06) | metadata (dict)
---|---|---|---|---|---|
1762821272 | 🛑 Ombi is down
In b7227c7, Ombi (https://ombi.0121.org) was down:
HTTP code: 502
Response time: 3343 ms
Resolved: Ombi is back up in 8df453d.
| gharchive/issue | 2023-06-19T06:12:12 | 2025-04-01T06:36:38.036057 | {
"authors": [
"012101do1"
],
"repo": "012101do1/upptime",
"url": "https://github.com/012101do1/upptime/issues/129",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2573248952 | 🛑 Nextcloud is down
In c18ffc4, Nextcloud (https://nextcloud.0121.org) was down:
HTTP code: 0
Response time: 0 ms
Resolved: Nextcloud is back up in a30ead4 after 15 minutes.
| gharchive/issue | 2024-10-08T13:40:40 | 2025-04-01T06:36:38.038854 | {
"authors": [
"012101do1"
],
"repo": "012101do1/upptime",
"url": "https://github.com/012101do1/upptime/issues/741",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2365862268 | Update
Fixed
merged
| gharchive/pull-request | 2024-06-21T07:00:03 | 2025-04-01T06:36:38.450355 | {
"authors": [
"0e8",
"JoeSmoePoe"
],
"repo": "0e8/niepogodasreroll",
"url": "https://github.com/0e8/niepogodasreroll/pull/13",
"license": "BSD-3-Clause",
"license_type": "permissive",
"license_source": "github-api"
} |
2087637140 | Field used but still warned as not used
Describe the bug
maxHP is used in line 39 but still reported as not used
This only happens on root query data; it works fine in a Fragment. However, id is not reported as unused even though both PokemonList and PokemonItem do not use it.
Reproduction
No response
gql.tada version
gql.tada 1.0.2
Validations
[X] I can confirm that this is a bug report, and not a feature request, RFC, question, or discussion, for which GitHub Discussions should be used
[X] Read the docs.
[X] Follow our Code of Conduct
@JoviDeCroock I was using LSP version 1.0.0 when creating this issue, but it still happens in 1.0.3.
That does not happen for me on the latest version; when you upgrade the LSP you have to restart your TSServer.
I did however discover a different bug where overlapping fields can become a nuisance, fixing that in https://github.com/0no-co/GraphQLSP/pull/182
can confirm it's fixed in 1.0.5
this is not fixed though:
However, id is not reported as not used even both PokemonList and PokemonItem do not use.
@deathemperor id and __typename are a reserved field for normalised caches and such
@JoviDeCroock updated my previous comment. The issue should still remain open.
That link reproduces nothing for me 😅
My bad, that repo doesn't reproduce it, but our production repo does. I'll take more of a look at it.
| gharchive/issue | 2024-01-18T04:32:44 | 2025-04-01T06:36:38.467025 | {
"authors": [
"JoviDeCroock",
"deathemperor"
],
"repo": "0no-co/GraphQLSP",
"url": "https://github.com/0no-co/GraphQLSP/issues/180",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2217097278 | 🛑 UC3 is down
In 35f969f, UC3 (https://uc3.unej.ac.id) was down:
HTTP code: 0
Response time: 0 ms
Resolved: UC3 is back up in 2b146d0 after 1 hour, 50 minutes.
| gharchive/issue | 2024-03-31T17:21:59 | 2025-04-01T06:36:38.469707 | {
"authors": [
"0rangebananaspy"
],
"repo": "0rangebananaspy/io",
"url": "https://github.com/0rangebananaspy/io/issues/3358",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1598576771 | Desktop version 1
In this pr I did the following :
Created a new desktop version of my personal portfolio that was originally in mobile version using a media query.
I made sure the site is responsive by adding a break-point at 768px.
@NduatiKagiri Buttons on my end seem to be center-aligned; may I know at what resolution they misbehave?
| gharchive/pull-request | 2023-02-24T12:53:25 | 2025-04-01T06:36:38.477226 | {
"authors": [
"0sugo"
],
"repo": "0sugo/portfolio_mobile_view",
"url": "https://github.com/0sugo/portfolio_mobile_view/pull/5",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1505589195 | Delete obsolete branches
The repo currently has 8 branches and it seems like at least 3 of them are obsolete. Ideally, we should be deleting branches as soon as the related PR is merged.
done
| gharchive/issue | 2022-12-21T01:58:45 | 2025-04-01T06:36:38.490549 | {
"authors": [
"Dominik1999",
"bobbinth"
],
"repo": "0xPolygonMiden/examples",
"url": "https://github.com/0xPolygonMiden/examples/issues/37",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2617390616 | Support ability to specify advice data via MASM
In some situations it may be desirable to specify some data which a given program assumes to be available in the advice provider. One example of this is read-only data output by the compiler, but there could be many other examples. Currently, such data needs to be loaded separately into the VM, which introduces extra complexity.
One way around this is to allow users to define data which is to be loaded into the advice provider before a given program starts executing. The syntax for this in MASM could look like so:
advent.FOO.0x9dfb1fc9f2d5625a5a304b9012b4de14df5cf6e0155cdd63a27c25360562587a
642174d286a4f38e4d2e09a79d048fe7c89dec9a03fce29cbe10d32aa18a1dc4
bb48645fa4ffe141f9a139aef4fa98aec50bded67d45a29e545e386b79d8cefe
0f87d6b3c174fad0099f7296ded3abfef1a282567c4182b925abd69b0ed487c3
c251ce5e4e2da760658f29f6a8c54788d52ae749afd1aef6531bf1457b8ea5fb
end
Here, advent specifies that we want to add an entry to the advice map. The key for the entry would be the word defined by the 0x9dfb1fc9f2d5625a... value. The data of the entry would be the list of field elements defined by the hex-encoded string. We also provide a way to specify a label FOO by which the key can be referred to from the code. For example:
begin
push.FOO
end
Would push the key 0x9dfb1fc9f2d5625a... onto the stack.
Upon assembly, this data would be added to the MastForest. For this, we'd need to add a single AdviceMap property to the MastForest struct - e.g., something like this:
pub struct MastForest {
/// All of the nodes local to the trees comprising the MAST forest.
nodes: Vec<MastNode>,
/// Roots of procedures defined within this MAST forest.
roots: Vec<MastNodeId>,
/// All the decorators included in the MAST forest.
decorators: Vec<Decorator>,
/// Advice map to be loaded into the VM prior to executing procedures from this MAST forest.
advice_map: AdviceMap,
}
Then, when the VM starts executing a given MAST forest, it'll copy the contents of the advice map into its advice provider (we can also use a slightly more sophisticated strategy to make sure that the map is copied only once).
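The "copy the map only once" strategy mentioned above can be sketched in plain Rust. This is a hypothetical sketch, not the actual miden-vm API: `AdviceMap` stands in as a bare `HashMap` with `u64` keys instead of words, and the `forest_id` handle and `load_forest` method are invented for illustration.

```rust
use std::collections::{HashMap, HashSet};

// Hypothetical stand-in for the real miden-core AdviceMap type.
type AdviceMap = HashMap<u64, Vec<u64>>;

struct AdviceProvider {
    map: AdviceMap,
    loaded_forests: HashSet<usize>, // forests whose maps were already copied
}

impl AdviceProvider {
    fn new() -> Self {
        Self { map: AdviceMap::new(), loaded_forests: HashSet::new() }
    }

    /// Copy a forest's advice map into the provider at most once:
    /// repeated calls with the same forest id are no-ops.
    fn load_forest(&mut self, forest_id: usize, advice_map: &AdviceMap) {
        if !self.loaded_forests.insert(forest_id) {
            return; // this forest's map was copied before
        }
        for (key, data) in advice_map {
            self.map.entry(*key).or_insert_with(|| data.clone());
        }
    }
}

fn main() {
    let mut provider = AdviceProvider::new();
    let forest_map: AdviceMap = HashMap::from([(1, vec![10, 20]), (2, vec![30])]);
    provider.load_forest(0, &forest_map);
    provider.load_forest(0, &forest_map); // second load is a no-op
    assert_eq!(provider.map.len(), 2);
}
```

Tracking loaded forest ids in a `HashSet` is one way to guarantee the copy happens once per forest even if execution re-enters the same MAST forest.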
Open questions
While the above approach should work, there are a few things we need to clarify before implementing it:
In the above example FOO refers to a full word. All our constants currently refer to single elements. Ideally, we should be able to tell by looking at the constant name whether it is for a full word or a single element. So, maybe we should come up with some simple scheme here to differentiate them?
Should the key handle FOO be accessible outside of the module it was defined in? It seems like it would be a good idea, but then we need to be able to apply some kind of visibility modifiers to advent.
How should we handle conflicting keys during assembly and execution?
If we encounter two entries with the same key but different data during assembly, this should probably be an error.
But what to do if we start executing a MAST forest which wants to load data into the advice provider but an entry with the same key but different data is already in the advice map? Should we error out? Silently replace the existing data with the new one? Anything else?
...
This change is beneficial to #1544 since I was thinking of a way to convey the notion that MastForest(code) requires the rodata loaded into the advice provider before it can be executed.
The MASM-facing part (syntax, parsing, etc.) of the implementation would take me quite a lot of time since I'm not familiar with the code, but the VM-facing part I believe I can do in a reasonable amount of time. If @bitwalker is ok with it, I can take a stab at it.
Upon assembly, this data would be added to the MastForest. For this, we'd need to add a single AdviceMap property to the MastForest struct - e.g., something like this:
pub struct MastForest {
/// All of the nodes local to the trees comprising the MAST forest.
nodes: Vec<MastNode>,
/// Roots of procedures defined within this MAST forest.
roots: Vec<MastNodeId>,
/// All the decorators included in the MAST forest.
decorators: Vec<Decorator>,
/// Advice map to be loaded into the VM prior to executing procedures from this MAST forest.
advice_map: AdviceMap,
}
Then, when the VM starts executing a given MAST forest, it'll copy the contents of the advice map into its advice provider (we can also use a slightly more sophisticated strategy to make sure that the map is copied only once).
I've taken a look and here are my findings on what needs to be done to implement this:
Move the AdviceMap type from processor to core;
Handle the AdviceMap when merging MAST forests (join with other AdviceMaps?);
Serialization/deserialization of the MastForest should handle the AdviceMap as well, but it'll break storing the rodata separately in the Package (roundtrip serialization would not work). We could put rodata in AdviceMap on the compiler side as well and not store it separately in the Package. @bitwalker is it ok?
Open questions
While the above approach should work, there are a few things we need to clarify before implementing it:
How should we handle conflicting keys during assembly and execution?
If we encounter two entries with the same key but different data during assembly, this should probably be an error.
Yes, I think it should be an error. From a rodata perspective, the digest is a hash of the data itself, so if the data is different, the digest will be different as well. From the MASM perspective, this might mean key/digest re-use, which does not seem like something a user would want, so failing early is a good thing to do.
But what to do if we start executing a MAST forest which wants to load data into the advice provider but an entry with the same key but different data is already in the advice map? Should we error out? Silently replace the existing data with the new one? Anything else?
If the user code treats the advice provider as some sort of dictionary, that's a valid use case. I'm not sure if it should be an error.
Handle the AdviceMap when merging MAST forests (join with other AdviceMaps?);
Yes, I think merging would work fine here. If there is a conflict (two entries with the same key but different data), we'd error out here as well.
Serialization/deserialization of the MastForest should handle the AdviceMap as well, but it'll break storing the rodata separately in the Package (roundtrip serialization would not work). We could put rodata in AdviceMap on the compiler side as well and not store it separately in the Package. @bitwalker is it ok?
Yeah - I think once we have this support for advice map entries in MastForest, there is no need to store rodata separately in the package.
In the above example FOO refers to a full word. All our constants currently refer to single elements. Ideally, we should be able to tell by looking at the constant name whether it is for a full word or a single element. So, maybe we should come up with some simple scheme here to differentiate them?
The parser already knows how to parse various sizes of constants, including single words, or even arbitrarily large data (the size of the data itself indicates which type it is).
Should the key handle FOO be accessible outside of the module it was defined in? It seems like it would be a good idea, but then we need to be able to apply some kind of visibility modifiers to advent.
These would be effectively globally visible symbols, and while unlikely, you can have conflicting keys, so I think any attempt to make it seem like these can be scoped should be avoided.
How should we handle conflicting keys during assembly and execution?
I'm not sure how we handle this during execution today actually, presumably we just clobber the data if two things are loaded with the same key into the advice map?
During assembly I think it has to be an error. It might be possible to skip the error if the data is the same, I think it's still an open question whether or not you would want to know about the conflicting key regardless.
I'm questioning a bit whether it makes sense to define this stuff in Miden Assembly.
I think the keyword has to be something more readable, advent - even knowing what it is supposed to be - still took me a second to figure out what it meant. Personally, I'd choose something more like advice.init or adv_map.init or something.
I've taken a look and here are my findings on what needs to be done to implement this:
Move the AdviceMap type from processor to core;
Handle the AdviceMap when merging MAST forests (join with other AdviceMaps?);
We'll need to catch conflicting keys (different values for the same key, but fine if the keys overlap with the same value), but a straight merge of the two maps should be fine otherwise.
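The merge rule described above — a straight map merge that only errors when the same key maps to different data — could be sketched like this. This is a hypothetical sketch, not the real miden-core code: `AdviceMap` stands in as a bare `HashMap` with `u64` keys, and the function name is invented.

```rust
use std::collections::HashMap;

// Hypothetical stand-in for the real AdviceMap type.
type AdviceMap = HashMap<u64, Vec<u64>>;

/// Merge `other` into `base`. Overlapping keys with identical data are
/// fine; the same key with different data is reported as a conflict.
fn merge_advice_maps(base: &mut AdviceMap, other: &AdviceMap) -> Result<(), u64> {
    for (key, data) in other {
        match base.get(key) {
            Some(existing) if existing != data => return Err(*key), // conflict
            Some(_) => {} // same key, same data: nothing to do
            None => {
                base.insert(*key, data.clone());
            }
        }
    }
    Ok(())
}

fn main() {
    let mut base: AdviceMap = HashMap::from([(1, vec![10])]);
    let ok: AdviceMap = HashMap::from([(1, vec![10]), (2, vec![20])]);
    assert!(merge_advice_maps(&mut base, &ok).is_ok());

    let conflicting: AdviceMap = HashMap::from([(1, vec![99])]);
    assert_eq!(merge_advice_maps(&mut base, &conflicting), Err(1));
}
```

Returning the offending key lets the assembler surface which advice-map entry caused the conflict in its error message.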
Serialization/deserialization of the MastForest should handle the AdviceMap as well, but it'll break storing the rodata separately in the Package (roundtrip serialization would not work). We could put rodata in AdviceMap on the compiler side as well and not store it separately in the Package. @bitwalker is it ok?
Once we can write our rodata to the MastForest directly, we won't need to do it in the Package anymore, so that sounds fine to me!
@greenhat For now, I would focus purely on the implementation around the MastForest/processor (what you've suggested AIUI), don't worry about the AST at all. That's all we need for the compiler anyway, while we figure out how to handle the frontend aspect in the meantime.
| gharchive/issue | 2024-10-28T06:00:56 | 2025-04-01T06:36:38.509541 | {
"authors": [
"bitwalker",
"bobbinth",
"greenhat"
],
"repo": "0xPolygonMiden/miden-vm",
"url": "https://github.com/0xPolygonMiden/miden-vm/issues/1547",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
957262684 | 🛑 Realbooru is down
In 1d5fd94, Realbooru (https://realbooru.com/index.php?page=dapi&s=post&q=index&limit=5&tags=slave) was down:
HTTP code: 0
Response time: 0 ms
Resolved: Realbooru is back up in aba2471.
| gharchive/issue | 2021-07-31T15:08:15 | 2025-04-01T06:36:38.523057 | {
"authors": [
"0xb0y"
],
"repo": "0xb0y/status",
"url": "https://github.com/0xb0y/status/issues/17",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1145320840 | Missing LICENSE
I see you have no LICENSE file for this project. The default is copyright.
I would suggest releasing the code under the GPL-3.0-or-later or AGPL-3.0-or-later license so that others are encouraged to contribute changes back to your project.
License file added
| gharchive/issue | 2022-02-21T04:52:51 | 2025-04-01T06:36:38.529513 | {
"authors": [
"0xdanelia",
"TechnologyClassroom"
],
"repo": "0xdanelia/rxr",
"url": "https://github.com/0xdanelia/rxr/issues/1",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
129988984 | How to trigger a error message for a particular field with custom message?
I have a sign-up form with a field txtSignUpUsername. On blur of txtSignUpUsername I check whether the username already exists; if it does, I need to trigger an error only for the field txtSignUpUsername: "username already exists, please choose another username".
My script is
var $txtUserName = "";

// prepare the form when the DOM is ready
$(document).ready(function() {
    $txtUserName = $("#txtSignUpUsername");
    $txtUserName.on("blur", function() {
        CheckUserName();
    });
});

function CheckUserName() {
    var userNameEntered = $txtUserName.val();
    if ($.trim(userNameEntered) != "") {
        $.ajax({
            type: "POST",
            url: "actions/checkusername.php",
            dataType: "json",
            data: { username: userNameEntered },
            success: function(data) {
                if (data.result == "no") {
                    // raise error for username field only
                }
            },
            error: function(data) {
                alert("error");
            }
        });
    }
}
My form is
<form id="frmSignUp" class="form-horizontal mw-form" data-toggle="validator" data-disable="false">
<div class="col-md-12">
<div class="form-group">
<div class="mw-sideHead">
Sign Up
</div>
</div>
</div>
<!-- Text input-->
<div class="form-group">
<div class="col-md-12">
<label class="control-label" for="textinput">Full Name</label>
<input id="txtSignUpName" name="textinput" placeholder="Full Name" class="form-control" type="text" required="" data-error="Full name required" maxlength="50">
<p class="help-block with-errors"></p>
</div>
</div>
<!-- Text input-->
<div class="form-group">
<div class="col-md-12">
<label class="control-label" for="textinput">Username</label>
<input id="txtSignUpUsername" name="textinput" placeholder="Username" class=" form-control" type="text" required="" data-error="Username required" maxlength="30">
<p class="help-block with-errors">
</p>
</div>
</div>
<!-- Text input-->
<div class="form-group">
<div class="col-md-12">
<label class="control-label" for="textinput">Email ID</label>
<input id="txtSignUpEmailID" name="textinput" placeholder="Email ID" class=" form-control" type="text" required="" data-error="Email id required" maxlength="100" >
<p class="help-block with-errors"></p>
</div>
</div>
<!-- Text input-->
<div class="form-group">
<div class="col-md-12">
<label class="control-label" for="textinput">Password</label>
<input id="txtSignUpPassword" name="" placeholder="Password" class="form-control" type="password" required="" data-error="Password required" maxlength="20">
<p class="help-block with-errors"></p>
</div>
</div>
<!-- Text input-->
<div class="form-group">
<div class="col-md-12">
<label class="control-label" for="textinput">Confirm Password</label>
<input id="txtSignUpConfirmPassword" name="textinput" placeholder="Confirm Password" class=" form-control" type="password" data-match="#txtSignUpPassword" data-match-error="Password does not match !" required="" data-error="Confirm password required" maxlength="20">
<p class="help-block with-errors"></p>
</div>
</div>
<br />
<!-- Button -->
<div class="form-group">
<div class="col-md-12">
<button id="btnSubmitSignUp" type="submit" name="singlebutton" class="btn btn-info pull-right">Sign Up</button>
</div>
</div>
</form>
I hope you understand my requirement; please tell me the solution.
Since this isn't an issue with the plugin, please post this question on StackOverflow instead.
| gharchive/issue | 2016-01-30T13:56:16 | 2025-04-01T06:36:38.535529 | {
"authors": [
"1000hz",
"rsmmukesh"
],
"repo": "1000hz/bootstrap-validator",
"url": "https://github.com/1000hz/bootstrap-validator/issues/263",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
} |
936445338 | [bug] dll version not injecting to client since 0.2.3
The dll version does not inject to the client since 0.2.3. Only the workshop version works after that which restricts its use to singleplayer unless you can convince a server owner to add it.
Not a bug, however injection can be re-added easily as a backup.
Added in e0baa6f1658b192cd4d0a9d9edf3e6f92dd2b143
| gharchive/issue | 2021-07-04T12:19:15 | 2025-04-01T06:36:38.539633 | {
"authors": [
"100PXSquared",
"Jacbo1"
],
"repo": "100PXSquared/VisTrace",
"url": "https://github.com/100PXSquared/VisTrace/issues/11",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
733940829 | Add sites: EMBl.org and Visual Framework
This PR adds two projects:
EMBL.org laboratory website
The code for this is linked, but it is access-restricted due to "internal policy". Not sure if a public codebase is a requirement?
The Visual Framework
An in-development component library, which is sponsored by the EMBL.org project
Thanks for maintaining the dashboard, it's really neat!
Thank you!
| gharchive/pull-request | 2020-11-01T11:16:17 | 2025-04-01T06:36:38.643103 | {
"authors": [
"khawkins98",
"zachleat"
],
"repo": "11ty/11ty-website",
"url": "https://github.com/11ty/11ty-website/pull/833",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2409074688 | JSX and TSX
Hey Zach, looking for any contributors? I wouldn't mind pitching in to add JSX and TSX as languages, as well as general maintenance. I could also help organize a group that pitches in.
I think JSX and TSX should be separate as TSX adds some TypeScript pushups that will frustrate people.
Yeah, absolutely! Be awesome to simplify the docs on these pages (or at least provide simplified instructions): https://www.11ty.dev/docs/languages/jsx/ https://www.11ty.dev/docs/languages/typescript/
@zachleat Here's a PR in a fork with tests and writeup.
I propose I toot about it, you quote-toot it, and we see if we can get some non-Zach eyeballs on it.
Once settled, merged, and released, I can do a PR for docs changes.
Why not PR it to this repository?
@uncenter I did, yesterday.
Did you mean to open two duplicate PRs?
I did, I wanted to treat JSX as different from TSX. The latter has a bit more ceremony (and Zach is doing TS stuff ATM.)
If you'd prefer, I can combine them.
| gharchive/issue | 2024-07-15T15:50:19 | 2025-04-01T06:36:38.647899 | {
"authors": [
"pauleveritt",
"uncenter",
"zachleat"
],
"repo": "11ty/eleventy-plugin-template-languages",
"url": "https://github.com/11ty/eleventy-plugin-template-languages/issues/12",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
Abnormal simple text display in the Demo
On Xcode 7.1 with the iOS 9 simulator, in the simple text display example, label1 only displays up to "面对着汹涌而来的现实,觉..."; the text after that point is truncated.
...
label1.lineBreakMode = kCTLineBreakByTruncatingTail;
label1.numberOfLines = 3;
| gharchive/issue | 2015-11-03T03:15:20 | 2025-04-01T06:36:38.662403 | {
"authors": [
"12207480",
"huangxianhui001"
],
"repo": "12207480/TYAttributedLabel",
"url": "https://github.com/12207480/TYAttributedLabel/issues/19",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
} |
Compile a timeline
Compile a list of all (important) life events, notable dates, achievements, periods that produced pleasant or unpleasant experiences, and so on. In this particular case there is no such thing as a list that is too long, only one that is poorly condensed, so we pack everything in here; the "extra" events can simply be left unmentioned later. The list should be systematized and sorted in such a way that the reader quickly grasps the "pattern" and can easily navigate the space and time of this résumé.
Compiling the timeline.
| gharchive/issue | 2024-06-03T04:08:44 | 2025-04-01T06:36:38.667749 | {
"authors": [
"123shchrbn456"
],
"repo": "123shchrbn456/homepage",
"url": "https://github.com/123shchrbn456/homepage/issues/26",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2637063734 | 🛑 Torrent is down
In 3aca3b5, Torrent (https://torrent.joshuastock.net) was down:
HTTP code: 502
Response time: 168 ms
Resolved: Torrent is back up in 942d048 after 52 minutes.
| gharchive/issue | 2024-11-06T05:17:51 | 2025-04-01T06:36:38.696926 | {
"authors": [
"1337Nerd"
],
"repo": "1337Nerd/uptime",
"url": "https://github.com/1337Nerd/uptime/issues/567",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1957661763 | Request: ability to disable last modified timestamp
I see this is part of single.html in v1.0.2, where it has an if .Lastmod conditional.
Any chance we can expose a new config parameter to turn off this behavior? I like the default "creation only" timestamp. Alternatively, maybe a post front matter parameter could be useful too.
Hi @jamesbraza,
If you don't supply Lastmod in a post's front matter, it will not show the Modified section.
Let me know if you were able to resolve it.
For me, my front matter is (it has no Lastmod):
---
title: "Foo"
date: 2023-04-23
draft: false
tags: ["article"]
---
Even explicitly adding Lastmod: false did not disable the "Modified" text. What am I missing?
Hi again,
It seems to be very strange indeed.
In this post, I have supplied Lastmod as follows.
title: "Typography"
slug : 'typography'
date: 2023-07-22T14:36:33+05:30
draft: false
featuredImg: ""
description : 'Integer lobortis vulputate mauris quis maximus. Vestibulum ac eros porttitor, auctor sem sed, tincidunt nulla. In sit amet tincidunt ex.'
tags:
- Demo
- Typography
Lastmod : 2023-08-15T15:36:33+05:30
I get :
Now if I remove:
title: "Typography"
slug : 'typography'
date: 2023-07-22T14:36:33+05:30
draft: false
featuredImg: ""
description : 'Integer lobortis vulputate mauris quis maximus. Vestibulum ac eros porttitor, auctor sem sed, tincidunt nulla. In sit amet tincidunt ex.'
tags:
- Demo
- Typography
I don't get the Lastmod section.
Please re-check your config to make sure everything is properly formatted, and/or do a full recheck of all the files to see if Lastmod is given somewhere.
Lol I am trying to figure it out, it's got me stumped too
One question, elsewhere I see .Params.tags, but with .Lastmod it's not using Params. Why is it not .Params.Lastmod?
Ahh I see. I have enableGitInfo = true in my config, which seems to globally enable .Lastmod: https://github.com/1bl4z3r/hermit-V2/blob/1ad173d2ab6817d7ca033b28b507df5ba8e08be6/hugo.toml#L32
Any chance you can add in some capability to disable enableGitInfo opting pages into .Lastmod?
To properly explain your previous query:
There were some changes in Hugo where, when defining custom local page variables (i.e. page variables whose scope is within the page itself), we can omit .Params, as it is implied that we are fetching local page variables. You can definitely write .Params.Lastmod and the output would be exactly the same.
It was something to differentiate inbuilt page variables from custom page variables. I am still unsure whether we can access custom page variables from other pages or not.
For enableGitInfo, I am quite unsure how to properly implement this, so that it would not break the core functionality.
Ohkay, here's a big brain moment.
What can be done is this: in each page a new page variable could be set up whose only job in the world is to enable/disable the [Modified:] section. It won't matter whether .Lastmod should be shown or not for the post.
If .GitInfo is true and .LastmodEnabler is false, Modified section is not shown
If .GitInfo is true and .LastmodEnabler is true, Modified section is shown, Date fetched from git
If .GitInfo is false and .LastmodEnabler is true, Modified section is shown, with each page having a dedicated .Lastmod
If .GitInfo is false and .LastmodEnabler is false, Modified section is not shown
if Page.Lastmodenabler
{
}
Let me know if you want this to be implemented.
I like what you propose! It:
Is simple and intuitive
Allows for both global configuration and per-page configuration
Solves my problem here too haha
Sounds good to me
Cool cool cool
It shouldn't take me more than 1 business day to implement.
Okay sound good, ahha no need to provide business days here, it's FOSS babyyyy
Implemented. Same is updated on #last-modified-date
If IgnoreLastmod is not provided or IgnoreLastmod=false, then:
If enableGitInfo = true, then Git Hash will be shown in [...] after Date.
If enableGitInfo = false, then:
If Lastmod is not provided or Lastmod has the same value as Date, an error will be thrown.
If Lastmod is provided and is different from Date, the value of Lastmod will be displayed in [...] after Date.
Closing this issue. Re-open if required.
Not Fixed yet
This is finalfinalfinal. As usual, details updated in #last-modified-date
If ShowLastmod:true :
If enableGitInfo = true, then Git Hash will be shown in [...] after Date.
If enableGitInfo = false, then:
If Lastmod is not provided or Lastmod has the same value as Date, an error will be thrown.
If Lastmod is provided and is different from Date, the value of Lastmod will be displayed in [...] after Date.
If ShowLastmod is not provided, it defaults to false; this is equivalent to providing ShowLastmod:false.
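The final rules above can be summarized as a small decision function. The sketch below is hypothetical plain Rust mirroring the logic (the actual implementation is a Hugo template in the theme); the function name and signature are invented for illustration.

```rust
/// Decide what the "Modified" section should show, mirroring the rules above.
/// Ok(None) means the section is hidden, Ok(Some(text)) is the text shown in
/// [...] after Date, and Err(..) covers the cases where an error is thrown.
fn modified_section(
    show_lastmod: Option<bool>, // ShowLastmod from front matter
    git_hash: Option<&str>,     // Some(hash) when enableGitInfo = true
    date: &str,
    lastmod: Option<&str>,
) -> Result<Option<String>, &'static str> {
    // ShowLastmod defaults to false when not provided.
    if !show_lastmod.unwrap_or(false) {
        return Ok(None);
    }
    if let Some(hash) = git_hash {
        // Git hash shown in [...] after Date.
        return Ok(Some(format!("[{hash}]")));
    }
    match lastmod {
        None => Err("Lastmod not provided"),
        Some(lm) if lm == date => Err("Lastmod has the same value as Date"),
        Some(lm) => Ok(Some(format!("[{lm}]"))),
    }
}

fn main() {
    assert_eq!(modified_section(None, None, "2023-01-01", None), Ok(None));
    assert_eq!(
        modified_section(Some(true), Some("e0baa6f"), "2023-01-01", None),
        Ok(Some("[e0baa6f]".to_string()))
    );
    assert!(modified_section(Some(true), None, "2023-01-01", Some("2023-01-01")).is_err());
}
```

Writing the four cases out this way makes it easy to check that ShowLastmod alone gates visibility, while enableGitInfo only chooses where the modified value comes from.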
And I was wrong.
Any page variable should be accessed via .Page.Params.; if you omit .Page or .Site, . by default has global scope.
.Lastmod is an inbuilt Hugo variable attached to GitInfo, hence it has global scope.
| gharchive/issue | 2023-10-23T17:30:39 | 2025-04-01T06:36:38.944891 | {
"authors": [
"1bl4z3r",
"jamesbraza"
],
"repo": "1bl4z3r/hermit-V2",
"url": "https://github.com/1bl4z3r/hermit-V2/issues/30",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
910358453 | U| EventLoop
Summary
Context
delete master branch
| gharchive/pull-request | 2021-06-03T10:35:08 | 2025-04-01T06:36:38.981682 | {
"authors": [
"2014100890"
],
"repo": "2014100890/2014100890.github.io",
"url": "https://github.com/2014100890/2014100890.github.io/pull/1",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1167211980 | trainer class is None when training with my own data
how to solve this problem?
Please make sure the trainer file exists in the path:
nnFormer/training/network_training/
Another possible reason is that the trainer file exists in the above path, but the class name in the trainer does not match the trainer file name.
Did you manage to run nnFormer on your own dataset successfully?
What does that mean? My own data only has foreground and background.
I ran python inference_synapse.py, but I got this.
When I opened dice_pre.txt, I got this.
Is the number of classes not set properly? What should I do?
Please make sure the trainer file exists in the path:
nnFormer/training/network_training/
Another possible reason is that the trainer file exists in the above path, but the class name in the trainer does not match the trainer file name.
| gharchive/issue | 2022-03-12T07:45:25 | 2025-04-01T06:36:39.022748 | {
"authors": [
"282857341",
"jjhHan",
"jjhhan",
"pcl1121",
"puppy2000"
],
"repo": "282857341/nnFormer",
"url": "https://github.com/282857341/nnFormer/issues/57",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
246415595 | Not working - JS Error
Uncaught TypeError: Cannot read property 'trim' of undefined
at trim (Util.js:121)
at splitWords (Util.js:127)
at NewClass.on (Events.js:49)
at NewClass.initialize (leaflet.markercluster-src.js:51)
at new NewClass (Class.js:22)
at Object.L.markerClusterGroup (leaflet.markercluster-src.js:1080)
at map_init (map:678)
at HTMLDocument.<anonymous> (map:683)
at fire (jquery.js:3187)
at Object.fireWith [as resolveWith] (jquery.js:3317)
I followed the documentation in the README; it looks outdated though.
Same error after install
@nfacha @kxxxo Big apologies... I do not have much time to fix the bug myself right now, but I'll fix it as soon as I can.
Same error...
I had to change
$cluster = new MarkerCluster([
'jsonUrl' => Yii::$app->controller->createUrl('projects/json')
]);
to this:
$cluster = new MarkerCluster([
'url' => Yii::$app->urlManager->createUrl('projects/json')
]);
Because jsonUrl throws this exception:
Setting unknown property: dosamigos\leaflet\plugins\markercluster\MarkerCluster::jsonUrl
and
Yii::$app->controller->createUrl('projects/json') throws: Calling unknown method: app\controllers\ProjectsController::createUrl()
but the error (Uncaught TypeError: Cannot read property 'trim' of undefined) is still not fixed.
| gharchive/issue | 2017-07-28T17:48:42 | 2025-04-01T06:36:39.035846 | {
"authors": [
"asilvestre87",
"kxxxo",
"nfacha",
"tonydspaniard"
],
"repo": "2amigos/yii2-leaflet-markercluster-plugin",
"url": "https://github.com/2amigos/yii2-leaflet-markercluster-plugin/issues/3",
"license": "bsd-3-clause",
"license_type": "permissive",
"license_source": "bigquery"
} |
1049144011 | Temporarily disable singleuser networkpolicy for Pangeo
Temporarily mitigates issue reported in https://discourse.pangeo.io/t/trying-to-open-gateway-cluster/1912 but we should come up with a better fix
This is a really weird error that I don't get
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
deployer/tests/test_hub_health.py:69: in test_hub_healthy
await check_hub_health(hub_url, test_notebook_path, api_token)
deployer/tests/test_hub_health.py:43: in check_hub_health
await execute_notebook(
/usr/local/Caskroom/miniconda/base/envs/pilot-hubs/lib/python3.8/site-packages/jhub_client/execute.py:115: in execute_notebook
return await execute_code(hub_url, cells, **kwargs)
/usr/local/Caskroom/miniconda/base/envs/pilot-hubs/lib/python3.8/site-packages/jhub_client/execute.py:94: in execute_code
logger.debug(f'kernel result cell={i} result=\n{textwrap.indent(kernel_result, " | ")}')
/usr/local/Caskroom/miniconda/base/envs/pilot-hubs/lib/python3.8/textwrap.py:480: in indent
return ''.join(prefixed_lines())
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
def prefixed_lines():
> for line in text.splitlines(True):
E AttributeError: 'NoneType' object has no attribute 'splitlines'
/usr/local/Caskroom/miniconda/base/envs/pilot-hubs/lib/python3.8/textwrap.py:478: AttributeError
--------------------------------------------------------------------- Captured stdout call ---------------------------------------------------------------------
Starting hub https://staging.us-central1-b.gcp.pangeo.io health validation...
Running dask_test_notebook.ipynb test notebook...
Hub https://staging.us-central1-b.gcp.pangeo.io not healthy! Stopping further deployments. Exception was 'NoneType' object has no attribute 'splitlines'.
I'm going to deploy manually and skip the tests for now. The actual config update was applied successfully and worked as expected when I tested on staging.
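The root cause is narrow: `textwrap.indent` calls `text.splitlines()`, so handing it `None` (a cell that produced no output) raises exactly this `AttributeError`. A minimal reproduction with the obvious guard, coercing `None` to an empty string; the function name below is illustrative, not jhub_client's actual API:

```python
import textwrap

def log_kernel_result(kernel_result):
    """Indent a kernel's output for logging, tolerating cells with no output."""
    # textwrap.indent(None, ...) raises AttributeError: 'NoneType' object
    # has no attribute 'splitlines', so substitute an empty string first.
    return textwrap.indent(kernel_result or "", " | ")

print(repr(log_kernel_result(None)))        # ''
print(repr(log_kernel_result("ok\ndone")))  # ' | ok\n | done'
```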
| gharchive/pull-request | 2021-11-09T21:36:15 | 2025-04-01T06:36:39.124917 | {
"authors": [
"sgibson91"
],
"repo": "2i2c-org/infrastructure",
"url": "https://github.com/2i2c-org/infrastructure/pull/819",
"license": "BSD-3-Clause",
"license_type": "permissive",
"license_source": "github-api"
} |
1013466571 | Recovering PR #706: Supporting native JupyterHub OAuthenticator alongside Auth0 for our hubs
Summary
This PR is a reconstruction of PR #706 which failed on merge due to JSON schema validation errors. In addition to reconstructing that, this PR also aims to address the JSON schema validation errors and extends the validate() function in deployer to also validate the secret config, if it exists.
fixes https://github.com/2i2c-org/pilot-hubs/issues/625
Changes to config/hubs/schema.yaml
I have borrowed a pattern from @damianavila to make the auth0.connection property conditional on auth0.enabled which is now a fully fledged property of auth0.
I have also set the default value of auth0.enabled to be true so that we don't have to go through every *.cluster.yaml file and add the enabled key for auth0.
I have deployed this to Pangeo staging (partnered with #707) and it works! So marking this as ready for review :)
Just a quick question - is this basically just the same PR as before, but now the original bug that we tripped has been fixed? If so, and if you've already tested it out, I'd be +1 on merging unless you want fresh eyes on any new stuff in particular, since the last PR already had some approves
I've updated the comment @yuvipanda pointed out, so I will merge once tests pass :)
| gharchive/pull-request | 2021-10-01T15:13:20 | 2025-04-01T06:36:39.129369 | {
"authors": [
"choldgraf",
"sgibson91"
],
"repo": "2i2c-org/pilot-hubs",
"url": "https://github.com/2i2c-org/pilot-hubs/pull/726",
"license": "BSD-3-Clause",
"license_type": "permissive",
"license_source": "github-api"
} |
1957875209 | 🛑 MS - SYNC is down
In 2fb45b3, MS - SYNC ($MS_SYNC_URL) was down:
HTTP code: 404
Response time: 217 ms
Resolved: MS - SYNC is back up in 668e5fa after 8 minutes.
| gharchive/issue | 2023-10-23T19:24:23 | 2025-04-01T06:36:39.139496 | {
"authors": [
"DanielVelasquezTreinta"
],
"repo": "30SAS/uptime",
"url": "https://github.com/30SAS/uptime/issues/1065",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2302951575 | 🛑 WEB - web.treinta.co is down
In 3abf7af, WEB - web.treinta.co ($WEB_WEB_TREINTA_CO_URL) was down:
HTTP code: 0
Response time: 0 ms
Resolved: WEB - web.treinta.co is back up in 099049d after 7 minutes.
| gharchive/issue | 2024-05-17T14:51:16 | 2025-04-01T06:36:39.142910 | {
"authors": [
"DanielVelasquezTreinta"
],
"repo": "30SAS/uptime",
"url": "https://github.com/30SAS/uptime/issues/2540",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2313219897 | 🛑 MS - B2B is down
In 57c655a, MS - B2B ($MS_B2B_URL) was down:
HTTP code: 0
Response time: 0 ms
Resolved: MS - B2B is back up in 248cd3c after 6 minutes.
| gharchive/issue | 2024-05-23T15:29:15 | 2025-04-01T06:36:39.145280 | {
"authors": [
"DanielVelasquezTreinta"
],
"repo": "30SAS/uptime",
"url": "https://github.com/30SAS/uptime/issues/2693",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2490533988 | 🛑 MS - B2B is down
In 0f13af6, MS - B2B ($MS_B2B_URL) was down:
HTTP code: 0
Response time: 0 ms
Resolved: MS - B2B is back up in cb928e8 after 7 minutes.
| gharchive/issue | 2024-08-27T23:23:57 | 2025-04-01T06:36:39.147782 | {
"authors": [
"DanielVelasquezTreinta"
],
"repo": "30SAS/uptime",
"url": "https://github.com/30SAS/uptime/issues/4271",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1793833185 | 🛑 MS - SHOP is down
In cd7f7a2, MS - SHOP ($MS_SHOP_URL) was down:
HTTP code: 404
Response time: 213 ms
Resolved: MS - SHOP is back up in 2083556.
| gharchive/issue | 2023-07-07T16:33:32 | 2025-04-01T06:36:39.150091 | {
"authors": [
"DanielVelasquezTreinta"
],
"repo": "30SAS/uptime",
"url": "https://github.com/30SAS/uptime/issues/653",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1722854624 | Doesn't work with 3.7.0
hi. it doesn't work with the new update 3.7.0. please update it :)
yeah same problem here, pls fix i cant play this game at 60fps after using this godly fps unlocker
Same here.
+1 on this. Hope tool can be updated for 3.7 and beyond
same cant use it
same problem
Same here. It's too awful playing again with 60 fps
same here
Same here
same here
same here
same here please fix IT thank you
A dark day for gamers...
Yep, doesn't work anymore. Gross how they would go out of their way to sabotage something that literally doesn't hurt anyone in any way. If there's no way around this, i'm quitting this game early.
Finally they patched it.
its over
60 FPS I really can't :(
this wasn't patched, just some internal changes, probably some semantic shit. Just needs the new memory pattern to scan for. Here's the issue
Does not work for me as well :(
Same here. I hope an update will be released soon. My eyes can't handle 60FPS anymore.. they're bleeding..
Meet to
I really hope someone comes in and finds the new memory patterns to fix it.
calm the fuck down, gonna update it soon
calm the fuck down, gonna update it soon
hurrah
I look forward to it
ty
calm the fuck down, gonna update it soon
Based
calm the fuck down, gonna update it soon
goated dev
W dev. Unplayable with 60fps, glad someone actually made this and fixes it ❤
i love <3
Calm down, I'll update it soon
Looking forward to it, thanks
updated, download at Release
updated, download at Release
<3
updated, download at Release
Hero <3
updated, download at Release
Cool
updated, download at Release
It doesn't run.
updated, download at Release
It doesn't run.
delete json config and run again
I don't know how to configure the json
I don't know how to configure the json
In the directory where the .exe file is placed, delete "fps_config.json".
Also, the Auto Close function doesn't actually work, which is why on the next start it locks to 60 fps again.
I don't know how to configure the json
In the directory where the .exe file is placed, delete "fps_config.json"; the Auto Close function doesn't actually work, so on the next start it locks to 60 fps again.
When I run version 2.1.0, it shows up in Task Manager but never actually appears on screen.
Calm down, I'll update it soon
When running 2.1.0.exe, the Setup install doesn't go smoothly and problems occur. Please fix it soon!
Someone else has a problem!
https://github.com/34736384/genshin-fps-unlock/issues/139#issue-1723272551
Seeing everyone in a state of panic and confusion because of the tool malfunction really cracks me up. Does mihoyo even realize how poorly their game is designed? They rely on a tool to enhance their 'game experience,' such an arrogant and disgusting company. They really need to reflect on themselves.
Calm down, I'll update it soon
When running 2.1.0.exe, the Setup install doesn't go smoothly and problems occur. Please fix it soon!
Someone else has the problem! #139 (comment)
I think the problem is with your PC. Also, the maintainer of this project is under no obligation to provide you with a solution. Please mind your manners.
Chrome blocked the download because it was flagged as malicious code.
works fine. thanks
thank you for fixing it. i can't live without the extra smoothness anymore.
is this program part of hutao? or another independent program?
I downloaded the new patch but i'm still locked at 60 fps, do i need to do something else besides replace the .exe?
Same here, still locked at 60. Tried changing different settings in program and game, but no luck.
new one stopped working also
| gharchive/issue | 2023-05-23T22:07:32 | 2025-04-01T06:36:39.177246 | {
"authors": [
"34736384",
"Dartv",
"DenSwitch",
"Drekaelric",
"GatoTristeY",
"GraveUypo",
"LouisD69",
"MisakaMikoto2333",
"Qepz",
"RyanDVasconcelos",
"SalenGency",
"UnlishedTen83",
"WenchenWang",
"copyvw",
"cybik",
"dioni04",
"hayzar-s",
"hotpot1026",
"jamespmnd",
"kuznetsov-ns",
"lunimater",
"markuskusxyren",
"mio12333",
"narnian19",
"nodaSnowball",
"nofuma99",
"pocasolta01",
"rlawjdtn8890",
"t0xic0der",
"theloraxofdeath",
"twiGGyAJOfficial",
"xzf0509",
"yangtryyds"
],
"repo": "34736384/genshin-fps-unlock",
"url": "https://github.com/34736384/genshin-fps-unlock/issues/137",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
330591831 | Follow best practices for Git commit messages
@xuri Could you follow the established best practices in Git commit messages? That would help external contributors to follow changes happening.
https://chris.beams.io/posts/git-commit/
Commit d96440edc480976e3ec48958c68e67f7a506ad32 breaks many rules:
many unrelated changes in a single commit
commit messge does not follow the standard format "1 summary line + 1 empty line + details"
By the way, I expect that the content of CONTRIBUTING.md also applies to project maintainers.
https://github.com/360EntSecGroup-Skylar/excelize/blob/master/CONTRIBUTING.md#commit-messages
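The convention being asked for ("1 summary line + 1 empty line + details", with a reasonably short summary) is mechanical enough to lint automatically. A hedged sketch of such a check — this is not part of excelize's actual tooling, and the 50-character limit is just the commonly recommended threshold from the linked article:

```python
def check_commit_message(message):
    """Return a list of violations of the summary / blank line / body rule."""
    lines = message.splitlines()
    problems = []
    if not lines or not lines[0].strip():
        problems.append("missing summary line")
    elif len(lines[0]) > 50:
        problems.append("summary longer than 50 characters")
    if len(lines) > 1 and lines[1].strip():
        problems.append("summary not separated from body by a blank line")
    return problems

print(check_commit_message("Fix timeout handling in worksheet reader\n\nExplain why."))  # []
print(check_commit_message("fixed stuff\nmore stuff"))
```

A script like this could run from a commit-msg hook so maintainer and contributor commits stay consistent.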
Thanks for your suggestion, I will follow best practices in the future code commit.
| gharchive/issue | 2018-06-08T09:46:53 | 2025-04-01T06:36:39.181105 | {
"authors": [
"dolmen",
"xuri"
],
"repo": "360EntSecGroup-Skylar/excelize",
"url": "https://github.com/360EntSecGroup-Skylar/excelize/issues/230",
"license": "BSD-3-Clause",
"license_type": "permissive",
"license_source": "github-api"
} |
2621505832 | fix: fixes conflict-related issues
Fixes #18
Each commit fixes one issue.
Commit 1: Rollbacking a specific array element.
Commit 2: Rollbacking a specific array element (nested).
Commit 3: Rollbacking a number-indexed object.
Explanation
Rollbacking a specific array element
Initial document
{
array: ['a', 'b']
}
Migration query:
{
$set: {
'array.1': 'new b'
}
}
Generated rollback query:
{
"$set": {
"_id": "6720f5a6efc5558c18b3a795",
"array": [
"a",
"b"
]
},
"$unset": {
"array.1": 1
}
}
Proposed fix
For a field X to be added to the $unset section, there must not be any key K from the $set section where ${K}. is included in X (array.1 must not include array. in the example above).
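That prefix rule is a one-line predicate over the flattened update keys; a sketch in Python for clarity (the library's own code differs, and the names here are illustrative):

```python
def keep_in_unset(unset_key, set_keys):
    """Keep a field in $unset only if no $set key K makes it a deeper
    path under K, i.e. unset_key does not start with f"{K}."."""
    return not any(unset_key.startswith(key + ".") for key in set_keys)

# 'array.1' lives under the restored 'array', so it must not be unset:
print(keep_in_unset("array.1", {"_id", "array"}))      # False
print(keep_in_unset("other.field", {"_id", "array"}))  # True
```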
Rollbacking a specific array element (nested)
Initial document
{
array: [
null,
{
nestedArray: [null, 'b']
},
],
};
Migration query:
{
$set: {
'array.1.nestedArray.1': 'new b'
}
}
Generated rollback query:
{
"$set": {
"_id": "6720f5daf91fae9d26b6990f",
"array": [
null
],
"array.1.nestedArray": [
null,
"b"
]
},
"$unset": {
"array.1.nestedArray.1": 1
}
}
The issue comes from the fact that we always set the complete array when we should be updating specific elements sometimes.
It's better in terms of performance to set an array completely rather than individually set its elements, so let's ensure that this remains the standard behavior when possible.
Proposed fix
After flattening the backup document, while iterating on its keys/values, the idea is to add an additional check when we encounter an array to restore: check if among the properties set during the update, a "deeper" key exists.
If that's the case, we should use
rollbackSet[${nestedPathToArray}.${index}] = value;
rather than
rollbackSet[nestedPathToArray][Number(index)] = value;
Rollbacking a number-indexed object
Initial document
{
object: {
0: 'a',
1: 'b'
},
};
Migration query:
{
$set: {
'object.1': 'new b'
}
}
Generated rollback query:
{
"$set": {
"_id": "6720f61a1f16e5169e89423e",
"object": "b"
},
"$unset": {
"object.1": 1
}
}
Individual array element setting has the same syntax as this case.
Proposed fix
Replace rollbackSet[nestedPathToArray] = value; with rollbackSet[key] = value;.
The rollbackSet[nestedPathToArray] = value; line was not covered by tests so I think unexpected side effects on that change are minimal.
Checklist
[x] Test new version on dev (locally).
[x] Test new version on staging (locally).
Hi @LucVidal360 @pp0rtal 👋
As @pp0rtal mentioned, I think it would be a good idea to create a RC and test this version in the CI if possible. I have tested it locally and things work as expected, but an additional check would be welcomed.
@LucVidal360 It sure was! 😁
@hugop95 Thanks a lot for this contribution! :rocket:
| gharchive/pull-request | 2024-10-29T14:54:02 | 2025-04-01T06:36:39.191193 | {
"authors": [
"hugop95",
"pp0rtal"
],
"repo": "360Learning/mongo-bulk-data-migration",
"url": "https://github.com/360Learning/mongo-bulk-data-migration/pull/19",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
188590915 | Utterance Issues
Here are a list of known utterance issues that seem reasonable for a user to say, but Alexa is thinking that the user is trying to invoke a different intent. More will be added as they are found.
Utterance: Clubs
Expected Intent: ClubsCategoryIntent
Actual Intent: AllCategoryIntent
Utterance: how much is it
Expected Intent: GetFeeDetailsIntent
Actual Intent: AllCategoryIntent
3
Utterance: Tell me the sports events
Expected Intent: SportsCategoryIntent
Actual Intent: NextEventIntent
OK, so I think this issue can be rolled into the other one. Any objections?
4
Utterance: Where is it
Expected Intent: LocationDetailIntent
Actual Intent: AllCategoryIntent
I think we need versions of the detail intent utterances that don't have slots.
5
Utterance: What's happening Monday
Expected Intent: GetEvenstOnDateIntent
Actual Intent: AllCategoryIntent
6
Utterance: What's tomorrow
Expected Intent: GetEventsOnDateIntent
Actual Intent: NextEventIntent
I'm closing this because all the quirks listed here have been fixed. The tree is currently undergoing some big changes, so a new issue may be started for new quirks that are encountered.
| gharchive/issue | 2016-11-10T19:03:40 | 2025-04-01T06:36:39.199992 | {
"authors": [
"eBucher",
"ezquire",
"freqlabs"
],
"repo": "370-Alexa-Project/CS370_Echo_Demo",
"url": "https://github.com/370-Alexa-Project/CS370_Echo_Demo/issues/65",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
759466495 | Server.json
Pingbypass client doesn't create the server.json file for some reason
The Server.json should be created on your VPS by the PingBypass. The Client creates no such file.
this is the client not the server xd
| gharchive/issue | 2020-12-08T13:56:29 | 2025-04-01T06:36:39.215978 | {
"authors": [
"123321ssd",
"3arthqu4ke",
"notperry1"
],
"repo": "3arthqu4ke/PingBypass-Client",
"url": "https://github.com/3arthqu4ke/PingBypass-Client/issues/2",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
623898024 | Added multithreading support
Reason for adding
To enable faster rendering of scenes on cpu's with a high amount of cores vs single core speed.
Allow users to opt in using the new "-j" or "--threads" flag followed by the integer number of threads to use (note that -j is picked because -t is taken and the syntax of gcc is -j4 for 4 cores).
Working Example
My CPU does not benefit massively from it, but using 2 threads gave me a roughly 10-20% speed boost (measured using the time command).
Too many threads slow down processing; base the count on your CPU architecture.
Threads can only work on separate scenes, so enabling extra threads when processing one scene will do nothing.
Makes use of the inbuilt threading library, so there is no need to add more dependencies.
Dude, you made lots of people angry...
Just one little piece of advice: Be careful
Seems outdated, so I'm closing this.
| gharchive/pull-request | 2020-05-24T16:28:54 | 2025-04-01T06:36:39.219811 | {
"authors": [
"DAMO238",
"TonyCrane",
"aliPMPAINT"
],
"repo": "3b1b/manim",
"url": "https://github.com/3b1b/manim/pull/1102",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
141751883 | Fix hardcoded timeout
Note: was introduced as a typo from https://github.com/3scale/3scale_ws_api_for_python/pull/21
@vdel26 Please include this fix as well. Thanks.
| gharchive/pull-request | 2016-03-18T01:50:19 | 2025-04-01T06:36:39.274207 | {
"authors": [
"hafizur-rahman"
],
"repo": "3scale/3scale_ws_api_for_python",
"url": "https://github.com/3scale/3scale_ws_api_for_python/pull/23",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
} |
1896616120 | demo
aetdfgsfdgh
/label test
| gharchive/issue | 2023-09-14T13:46:50 | 2025-04-01T06:36:39.284489 | {
"authors": [
"tjololo"
],
"repo": "418-cloud/testapp",
"url": "https://github.com/418-cloud/testapp/issues/31",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
2370515883 | SolrDedupServiceImpl: potential bug found in method cleanIndex
Bug Description
We found a potential bug in method cleanIndex:
https://github.com/4Science/DSpace/blob/04c7aa622a94e048a056dee462d8220592764971/dspace-api/src/main/java/org/dspace/app/deduplication/service/impl/SolrDedupServiceImpl.java#L637
Should this check be modified to i != null?
As far as I understand the code, I would not say this is a bug. If there is an object in the index that does not exist in the database, it's removed from the index. If it exists, it should not be removed, as long as the index is not meant to be erased completely. If the intent were to erase the whole index, this check would not be needed at all and all objects should be unindexed (but I guess it does not make sense to erase the index by iterating through all objects).
So I don't think this is a bug and would guess it's intended as it is.
@olli-gold , thanks. That makes sense. I'll close the issue as won't fix.
@olli-gold , I have to reopen this issue.
If i is null, then in
https://github.com/4Science/DSpace/blob/04c7aa622a94e048a056dee462d8220592764971/dspace-api/src/main/java/org/dspace/app/deduplication/service/impl/SolrDedupServiceImpl.java#L642
the null value is passed as second method parameter.
At the end, this will result in a NullPointerException in
https://github.com/4Science/DSpace/blob/04c7aa622a94e048a056dee462d8220592764971/dspace-api/src/main/java/org/dspace/app/deduplication/service/impl/SolrDedupServiceImpl.java#L533
because of null object access in item.getID() and item.getType(). Do you agree?
Oh, yes, you are right. This is obviously a bug, which needs to be fixed.
| gharchive/issue | 2024-06-24T15:19:31 | 2025-04-01T06:36:39.310152 | {
"authors": [
"olli-gold",
"saschaszott"
],
"repo": "4Science/DSpace",
"url": "https://github.com/4Science/DSpace/issues/462",
"license": "BSD-3-Clause",
"license_type": "permissive",
"license_source": "github-api"
} |
1615982014 | pruned and updated EC2 spreadsheet
Updated EC2 spreadsheet using https://instances.vantage.sh/
Hm, I wanted the process to be reproducible, so I avoided manually adding entries. The indexing needed to be changed for this, which is responsible for the missing names (and can be fixed). I'll see what will help.
| gharchive/pull-request | 2023-03-08T21:38:26 | 2025-04-01T06:36:39.319334 | {
"authors": [
"clarabakker"
],
"repo": "4dn-dcic/Benchmark",
"url": "https://github.com/4dn-dcic/Benchmark/pull/2",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
189228600 | Future of RipMe: separation of program + ripper logic
@4pr0n seems to not be maintaining this project on a regular basis anymore. It's totally understandable. I myself haven't had any time lately to dedicate to any side hobby projects, let alone this one. For the most part, RipMe still continues to work for me. (I still use it multiple times a week.)
As discussed in #247, it seems many people feel like the project is dead -- and/or, for their scenarios, it is not working as well anymore, and those scenarios are not being updated to fix the problems.
@Wiiplay123 @i-cant-git - since you two have also been contributors to the project and commented on #247, I wonder if we can discuss a potential future for this project? Also if you could also share if you know of any currently-maintained projects that can be used as alternatives?
I think this style of catch-all project is the sort of thing that is unmaintainable in the long term except by a seriously dedicated effort. The problem is that there's essentially no limit to the number of rippers that could be included in this project's source code. Things have gotten really bloated here, and everyone is depending on official updates from a single source to add new rippers. It's hard to know how to prioritize maintenance.
Questions like which rippers people use most can only be answered by how loudly people complain about the broken ones. There are a lot of more obscure sites that are supported in-box with RipMe (I contributed some of them), and maybe some of the more common ones go by the wayside when trying to support so many.
I've been thinking lately that this project is really in two distinct parts:
There's the core of the project which provides the structure, interface, and framework to use to define the rippers.
There's the rippers themselves, which all more or less follow a consistent algorithm of starting at a URL, navigating through some HTML, extracting image links from the HTML, and queuing those images to be ripped.
I've been thinking it might be a worthwhile effort to separate the two concerns. Keep all of part #1 in the same repo, and expose rippers as a plug-in model. Move the rippers into another repo, maybe keep just the core of the rippers maintained by the main project (in a separate repo), and make it easy for users to define their own locally on their machine. Add a way to add new ripper sources (github repos, local sources, links to individual definition files) in the RipMe UI.
Rippers could even be written in a non-compiled scripting language like JavaScript (since the JDK has a built-in ScriptEngine for JavaScript), or if we can separate concerns well enough, define the logic in a simple description language like JSON. If we could do that, individuals could maintain their own rippers, and we could provide links to known ripper sources, as well as including a few of those sources in the default application configuration.
Pages that host image content generally look like one of the following:
Images are embedded directly in the page (action: download the embedded images)
Thumbnails link to full-size images (action: download the images at the links)
Thumbnails link to another page like 1 (action: load the linked page and then use action 1)
Thumbnails link to another gallery page like 2 or 3 (action: load the page and then use action 2 or 3).
Thumbnails link through an ad-wall which redirects to an image or a page like 1 or 2 (I'm not sure if we currently have any rippers which automate getting through the ad wall)
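Styles 1 and 2 can often be told apart from the raw HTML alone: embedded `<img>` tags versus `<a>` links whose targets look like image files. A rough, stdlib-only sketch of that detection — heuristics for illustration, not RipMe's actual logic (RipMe is Java; this is a Python sketch):

```python
from html.parser import HTMLParser

IMAGE_EXTS = (".jpg", ".jpeg", ".png", ".gif")

class GalleryScanner(HTMLParser):
    """Collect embedded images (page style 1) and linked images (style 2)."""

    def __init__(self):
        super().__init__()
        self.embedded = []
        self.linked = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and attrs.get("src"):
            self.embedded.append(attrs["src"])
        elif tag == "a" and (attrs.get("href") or "").lower().endswith(IMAGE_EXTS):
            self.linked.append(attrs["href"])

def classify(html):
    """Guess which generic ripping strategy applies to a page's HTML."""
    scanner = GalleryScanner()
    scanner.feed(html)
    if scanner.linked:
        return "style 2: download the linked full-size images", scanner.linked
    if scanner.embedded:
        return "style 1: download the embedded images", scanner.embedded
    return "unknown: needs a dedicated ripper", []

page = '<a href="/full/1.jpg"><img src="/thumb/1.jpg"></a>'
print(classify(page)[0])  # style 2: download the linked full-size images
```

Styles 3-5 would then be handled by loading the linked pages and re-running the same classification, with the rate-limiting delays noted below.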
The sites we are interested in making rippers for are either one of the above, or a social style of website where users aggregate content by linking to pages like the above.
For sites formatted like 1 and 2 (example: 4chan threads are formatted like 2), AND where all content is on a single page (whether the content is embedded or linked), there are already many tools which download content from any arbitrary website (no specific ripper logic would really be needed in that case, and actually significantly restricts the usefulness of the ripper). Here's a recommendation for a download manager that can deal with that kind of website (Firefox only, unfortunately, but since RipMe users are using an external program to do our image downloads, I'm sure that's okay): http://www.downthemall.net/
For me, that covers a lot of sites I'm interested in that don't already have rippers, and also covers a lot of sites that do already have rippers. In that case, the rippers are probably redundant.
For sites like 1 and 2 where all the content isn't on a single page, we need to supply some logic to navigate from page to page, and otherwise, the generic techniques for 1 and 2 can be automatically applied once we get to a page where they apply.
Sometimes it is possible to construct the URL of the image from the thumbnail in a gallery in style 3. The e-hentai ripper is an example of this. Following a technique like that saves us from loading a ton of additional pages, which saves time and keeps the ripper from getting blocked because it made too many requests in a short period of time (DDOS detection or REST API limiting).
One place a program like this helps a lot is for sites like Tumblr and Instagram that deliberately make it difficult for a user to download the content by either blocking right-clicking or by obscuring the image in the web page somehow that makes it difficult or impossible to right-click and save. But, because those images are downloaded into the page, it is possible for us to get those links and download them to save on the user's computer. The location to find the URL on the page is usually easily extracted with some simple HTML-traversal logic. This is the sort of automation we strive to allow with RipMe.
I think the biggest use-case for this application is mostly for websites that host community-generated content in large or even indefinitely-sized albums, especially when that content is spread out over many pages: Reddit (subreddits, user profiles), Imgur (mainly because of heavy use in Reddit), Tumblr, Instagram.
Those are just some thoughts. There's likely to be more.
Summary of action items:
To reduce Ripper maintenance, enable automatic detection of page styles 1 and 2 and do the right thing in those cases. Then, remove rippers with only that basic logic. Possibly, add a whitelist of URL patterns known to be page styles 1 and 2, so that the user never needs to know there's no longer a dedicated Ripper for those pages.
Page styles 3 and 4 could be automatically detected and ripped, but we should be careful to add delays so that the Ripper doesn't get blocked for requesting too many pages at once. Rippers that use the gallery to deduce the actual image URLs should be kept. This style of ripper logic would likely be easy to encode as a simple RegEx like s/(.*foobar\.com.*)\/thumb(\/.*\.jpg)/\1\2/ -- remove the /thumb/ from the path.
Page style 5 would be difficult to detect automatically without trying to navigate the pages but we might be able to add logic to automatically click through different kinds of ad-walls like adf.ly. Once we get to the other side of the ad-wall we could try to automatically detect the type of page and do the right thing.
After that, any Rippers which meaningfully improve performance or reliability could be added to one of the Ripper galleries (either the separate repo maintained by the RipMe project maintainers, or a third party ripper repo).
Even for the well-known gallery types, it's still nice to automatically detect an album name from the page. I think currently, we don't ever try to detect an album name and instead let the ripper decide it. Still nice to let the ripper decide if it wants to, but detecting the title from the page would be nice as well, as long as we're going ahead with automatic detection.
Additionally, I realized that if we release at least some rippers via separate repos, especially as non-compiled scripting or spec code, we don't have to re-release (and force users to re-install) a new version of RipMe for every small fix or update to the rippers. We just download the new ripper definitions and continue with the same version of the software.
I'd propose moving to version 2.0 if we refactor the API this way, and make sure that we use SemVer with respect to the ripper definition interface to ensure compatibility of rippers with a particular version of RipMe.
@4pr0n on a related note, would you consider adding some of us who have previously contributed to this project as contributors so that we can manage issues/PRs and not have to hijack this project on a fork or simply leave the project to die?
I nominate myself as a maintainer :)
This looks great. Has the project been forked?
@seattle255 In fact, @4pr0n just added me as a collaborator (I requested via PM over on Reddit) so I guess I'm at least partially taking over management of the project.
First things first is to bring the project up-to-date for various changes to the websites that have partially or completely broken the rippers. Before making any big changes we need to fix some things. Time to start merging some pull requests! (Although that might need to wait until I'm free after the holidays.)
okay, first of all, i have pretty much zero knowledge on how all of this works but i have been using this tool for a long time now and appreciate how much work and effort you guys put into it.
Im basically an end user who barely knows how to tinker with the configs. pretty much all ive done in this project is to suggest websites to be added and etc.
I am wondering how this would affect me, or a significant number of other users. My interpretation is that you guys are planning on, let's say, making a separate "game" with the ripper, with rippers for other sites acting as additional "DLCs", or whatever analogy fits better.
anyway good luck to you guys and whatever youre all planning to do and ill gladly help test them out on all the sites ive used ripme on and other possible sites as well
@ravenstorm767 - There's nothing to worry about. The program would work the same way as before, with some new features, as described above. Anything would either just work, or would work like installing a plug-in, to support new websites.
I do hear you that making the project more complicated for the users is a non-goal. I'll reconsider some of what I've proposed to ensure that the project stays simple and easy to use.
The main motivation here is to reorganize the code to make things a bit easier to maintain. As you've probably noticed, there's a lot of interest in a large number of websites for this program to support, and a serious lack of man-hours to support this project. The less that needs maintaining in the core project, the easier it will be on the maintainers.
If we could make a large number of websites "just work" without specific support, that would be a huge step forward in reducing the cost to maintain this project, with no loss of features.
Until I get started, I won't know how feasible these changes will be. Now that the project has an additional maintainer, it will be much easier to keep the project healthy.
Noticed this early comment by @4pr0n: https://github.com/4pr0n/ripme/issues/8#issuecomment-40295011 laying out plans for a generic ripper.
| gharchive/issue | 2016-11-14T21:16:39 | 2025-04-01T06:36:39.362871 | {
"authors": [
"metaprime",
"ravenstorm767",
"seattle255"
],
"repo": "4pr0n/ripme",
"url": "https://github.com/4pr0n/ripme/issues/361",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
} |
237982466 | Problem with BeeLogger
This is my problem
sh: 1: wine: not found
Traceback (most recent call last):
File "bee.py", line 161, in
main()
File "bee.py", line 134, in main
os.rename('dist/k.exe', 'dist/' + name)
OSError: [Errno 2] No such file or directory
Probably you don't have the repo.
See: https://docs.kali.org/general-use/kali-linux-sources-list-repositories
| gharchive/issue | 2017-06-22T21:42:06 | 2025-04-01T06:36:39.395638 | {
"authors": [
"UndeadSec",
"iFireTech"
],
"repo": "4w4k3/BeeLogger",
"url": "https://github.com/4w4k3/BeeLogger/issues/41",
"license": "bsd-3-clause",
"license_type": "permissive",
"license_source": "bigquery"
} |
2482327327 | Getting multiple touch issue when trying to take a screenShot
Whenever I try to take a screenshot I get an error like (TypeError: Cannot read property 'x' of undefined, js engine: hermes).
Please help me with this.
@5up-okamura please respond.
| gharchive/issue | 2024-08-23T05:15:06 | 2025-04-01T06:36:39.426188 | {
"authors": [
"Rootsalman"
],
"repo": "5up-okamura/react-native-draggable-gridview",
"url": "https://github.com/5up-okamura/react-native-draggable-gridview/issues/13",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2733530396 | Update index.ts
Why not use a Date object to construct a new SolarDay?
The Date type has many pitfalls.
| gharchive/pull-request | 2024-12-11T17:12:19 | 2025-04-01T06:36:39.429445 | {
"authors": [
"6tail",
"Lfan-ke"
],
"repo": "6tail/tyme4ts",
"url": "https://github.com/6tail/tyme4ts/pull/12",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1019092507 | Add math lib to PartialCommonOwnership721.sol
In working on https://github.com/721labs/partial-common-ownership/issues/9, I learned that the non-deterministic test results are caused by Solidity rounding down during integer division. This PR seeks to fix this by adding a Math library that provides representational-support for floating point numbers.
👋🏻 Adding a Math library ended up being unnecessary to pass the failing tests (which failed as a result of broken tests, not broken logic).
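For readers unfamiliar with the underlying issue, here is a tiny illustration (Python, with made-up numbers) of why the order of operations matters under integer division; Solidity's `/` truncates the same way Python's `//` does, which is why dividing before scaling can make computed values flicker in tests:

```python
# Integer division truncates toward zero, just like Solidity's "/".
# Dividing the rate first destroys it; scaling before dividing keeps precision.
price = 1000
numerator, denominator = 1, 3  # e.g. a 1/3 fee rate (illustrative values)

naive = price * (numerator // denominator)   # 1 // 3 == 0, so the fee vanishes
scaled = (price * numerator) // denominator  # 1000 // 3 == 333

print("naive:", naive)
print("scaled:", scaled)
```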
| gharchive/pull-request | 2021-10-06T19:50:40 | 2025-04-01T06:36:39.436228 | {
"authors": [
"will-holley"
],
"repo": "721labs/partial-common-ownership",
"url": "https://github.com/721labs/partial-common-ownership/pull/15",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
276652426 | Get a cookie from a defined path
Hi guys!
Great module! It's very useful!
I have a question for you. I am going to receive a cookie in my application from an external CRM. It could be stored with a path and domain. Using your library, I tried to retrieve it using the "get" method with the cookie's name, but I can't get it if it was created with a specified path.
Can I retrieve a cookie specifying the path that was used to create the cookie?
Thanks you in advance!
Jose
Hello,
I am very sorry for the late answer. I am quite busy at the moment and guess that this won't change until the end of the year 😀
Have you managed to solve your issue yet?
I will try to answer your question to the best of my knowledge. It is possible to get cookies from different paths, if you're doing this on the server side. It is more hacky and not recommended, if you're using JavaScript. Sources:
https://stackoverflow.com/questions/945862/retrieve-a-cookie-from-a-different-path
https://www.sitepoint.com/community/t/access-cookie-of-different-more-specific-path-but-same-domain/6475/4
I hope this helps.
Cheers
PS: Again, if you solved your issue I would appreciate if you could share your solution here, so that others can use your experience for future reference 👍
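As a side note on why the path cannot be recovered on either side: the browser's Cookie request header carries only name=value pairs, never the path or domain attributes. A minimal Python sketch (standard library only, unrelated to ngx-cookie-service's own API) shows that parsed request cookies have an empty path attribute:

```python
from http.cookies import SimpleCookie

# An illustrative Cookie request header as a browser would send it:
# only name=value pairs, no path/domain metadata.
header = "session=abc123; theme=dark"
jar = SimpleCookie()
jar.load(header)

print(jar["session"].value)    # the value survives the round trip
print(repr(jar["session"]["path"]))  # '' -- the header carried no path attribute
```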
| gharchive/issue | 2017-11-24T15:51:23 | 2025-04-01T06:36:39.448424 | {
"authors": [
"CunningFatalist",
"blecua84"
],
"repo": "7leads/ngx-cookie-service",
"url": "https://github.com/7leads/ngx-cookie-service/issues/7",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
} |
2404798740 | Fix: OCI registry when releasing helm chart
Installation of helm chart use this path in the OCI registry: oci://8gears.container-registry.com/library/n8n but with latest release it is required to add an extra path /n8n
When trying:
helm pull oci://8gears.container-registry.com/library/n8n --version 0.24.0
Error: 8gears.container-registry.com/library/n8n:0.24.0: not found
it works when
helm pull oci://8gears.container-registry.com/library/n8n/n8n --version 0.24.0
But that's not the path used in the helm chart documentation. CF: https://artifacthub.io/packages/helm/open-8gears/n8n/
You can also confirm that latest release 0.24.0 is not available in the helm chart. CF: https://artifacthub.io/packages/helm/open-8gears/n8n/?modal=changelog
Summary by CodeRabbit
Chores
Updated Helm chart push configuration to a new location within the container registry.
@albertollamaso thank you, for correcting that typo.
new release is out
Thanks @Vad1mo for quick action on this. I am now able to pull the release version 0.24.0 using the proper OCI ur.
Perhaps it is not showing in the UI: https://artifacthub.io/packages/helm/open-8gears/n8n/?modal=changelog
Not sure if an extra step is required in artifacthub.io to be honest.
| gharchive/pull-request | 2024-07-12T05:41:28 | 2025-04-01T06:36:39.479609 | {
"authors": [
"Vad1mo",
"albertollamaso"
],
"repo": "8gears/n8n-helm-chart",
"url": "https://github.com/8gears/n8n-helm-chart/pull/105",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
167855230 | Update Logger.php
Q
A
Bug fix?
yes
New feature?
no
BC breaks?
no
Deprecations?
no
License
MIT
$context['request'] could be null (i.e. when the requested URL could not be resolved - DNS-wise)
Which results in:
Type error: Argument 1 passed to EightPoints\Bundle\GuzzleBundle\Log\LogResponse::__construct() must be an instance of Psr\Http\Message\ResponseInterface, null given
Nice! Thanks for the fix!
Going to merge it after Travis CI tests and create a new bugfix version (5.0.1).
| gharchive/pull-request | 2016-07-27T13:35:01 | 2025-04-01T06:36:39.483278 | {
"authors": [
"florianpreusner",
"mathielen"
],
"repo": "8p/GuzzleBundle",
"url": "https://github.com/8p/GuzzleBundle/pull/57",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2157608194 | 🛑 instagram is down
In 8c2f40d, instagram (https://www.instagram.com/9renpoto/) was down:
HTTP code: 429
Response time: 179 ms
Resolved: instagram is back up in 6351b4c after 8 minutes.
| gharchive/issue | 2024-02-27T21:13:23 | 2025-04-01T06:36:39.498754 | {
"authors": [
"9renpoto"
],
"repo": "9renpoto/upptime",
"url": "https://github.com/9renpoto/upptime/issues/1343",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1499060636 | 🛑 instagram is down
In 5e7727b, instagram (https://www.instagram.com/9renpoto/) was down:
HTTP code: 429
Response time: 121 ms
Resolved: instagram is back up in a3ab889.
| gharchive/issue | 2022-12-15T20:45:07 | 2025-04-01T06:36:39.501340 | {
"authors": [
"9renpoto"
],
"repo": "9renpoto/upptime",
"url": "https://github.com/9renpoto/upptime/issues/446",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2710918118 | 🛑 instagram is down
In 1ebc620, instagram (https://www.instagram.com/9renpoto/) was down:
HTTP code: 429
Response time: 304 ms
Resolved: instagram is back up in aa9eefc after 8 minutes.
| gharchive/issue | 2024-12-02T07:46:56 | 2025-04-01T06:36:39.503924 | {
"authors": [
"9renpoto"
],
"repo": "9renpoto/upptime",
"url": "https://github.com/9renpoto/upptime/issues/4561",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2719980997 | 🛑 threads is down
In 311472d, threads (https://www.threads.net/@9renpoto) was down:
HTTP code: 429
Response time: 308 ms
Resolved: threads is back up in ce739bb after 26 minutes.
| gharchive/issue | 2024-12-05T10:18:52 | 2025-04-01T06:36:39.506286 | {
"authors": [
"9renpoto"
],
"repo": "9renpoto/upptime",
"url": "https://github.com/9renpoto/upptime/issues/4604",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2721119730 | 🛑 threads is down
In d16510c, threads (https://www.threads.net/@9renpoto) was down:
HTTP code: 429
Response time: 378 ms
Resolved: threads is back up in f4ec3dc after 12 minutes.
| gharchive/issue | 2024-12-05T18:29:21 | 2025-04-01T06:36:39.508633 | {
"authors": [
"9renpoto"
],
"repo": "9renpoto/upptime",
"url": "https://github.com/9renpoto/upptime/issues/4613",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2584053111 | Account for the contribution from the source organ Other in the residual radioactivity of composite compartments
Close #117
Changed composite compartments so that the contribution from Other is added as a component. For the contribution ratio, I used the mass ratio obtained by dividing the total mass of the source regions that form part of the composite compartment by the total mass of Other as a whole.
Also, as shown in https://github.com/9rnsr/FlexID/issues/117#issuecomment-2409019405, exception handling is applied to Skeleton* so that the contribution from Other is not considered.
summary (2024-10-13 AddOtherToComposite).xlsx
As a result, the residual radioactivity of the digestive tract and lungs now agrees for most nuclides and most of the output time meshes, which is a substantial improvement.
Initially I thought the impact of this problem on the lung discrepancy would be small (since for inhalation intake all the related compartments are explicitly specified in the input), but that turned out not to be the case. I noticed after the above results came out that, among the source regions making up the thoracic region, the largest by mass ratio is ALV (about 66%) and the next largest is Lung-Tis (about 30%); since Lung-Tis is included in Other, the discrepancy was to be expected given that about 30% of the contribution by mass was missing.
| gharchive/pull-request | 2024-10-13T15:32:33 | 2025-04-01T06:36:39.511193 | {
"authors": [
"9rnsr"
],
"repo": "9rnsr/FlexID",
"url": "https://github.com/9rnsr/FlexID/pull/125",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
655146269 | Adjust the discarded region size to be relative to the size of the image
Previous value of 15 works for images smaller than 1000 * 1000 pixels size but if its larger then the region size needs to be larger too.
fixed.
| gharchive/issue | 2020-07-11T06:20:43 | 2025-04-01T06:36:39.512723 | {
"authors": [
"muthuspark"
],
"repo": "9sphere/text-detector",
"url": "https://github.com/9sphere/text-detector/issues/2",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1691817119 | feat: "Delete unnecessary files"
Team lead, I spent my weekend diligently cleaning up after our star employee, so please give me double the paid leave!
good job!
| gharchive/pull-request | 2023-05-02T05:51:02 | 2025-04-01T06:36:39.520029 | {
"authors": [
"WVyun",
"hijump9"
],
"repo": "ABcomYeardream/team_p",
"url": "https://github.com/ABcomYeardream/team_p/pull/4",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2104400752 | Identify Core Datasets needed to run the COSIMA-recipes
We need to make sure that all data required to run the recipe is available.
@max-anu, I believe you have made a list somewhere.
I am especially concerned about the things that are stored in hh5.
We need to clarify what is needed to run the recipes and what we can/will support or not.
| gharchive/issue | 2022-11-16T00:15:34 | 2025-04-01T06:36:39.521181 | {
"authors": [
"rbeucher"
],
"repo": "ACCESS-NRI/COSIMA-recipes-workflow",
"url": "https://github.com/ACCESS-NRI/COSIMA-recipes-workflow/issues/58",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
2493270551 | Historical - Increase jobfs to 1500MB
This pull request increases the jobfs requested for the historical configuration to 1500MB from the default 800MB, with the goal of avoiding jobfs exceeded errors.
Closes historical half of #83.
Note that jobfs is shared over nodes
https://opus.nci.org.au/display/Help/PBS+Directives+Explained#PBSDirectivesExplained--ljobfs=<10GB>
On 48 cpu nodes ACCESS-ESM1.5 uses 8 nodes. If the JOBFS is used in setup it will only be the root node, so the usage will be concentrated on a single node.
Dale wrote a very nice explainer of JOBSFS in case anyone is interested
https://climate-cms.org/posts/2022-11-10-jobfs.html
| gharchive/pull-request | 2024-08-29T01:34:27 | 2025-04-01T06:36:39.523676 | {
"authors": [
"aidanheerdegen",
"blimlim"
],
"repo": "ACCESS-NRI/access-esm1.5-configs",
"url": "https://github.com/ACCESS-NRI/access-esm1.5-configs/pull/84",
"license": "CC-BY-4.0",
"license_type": "permissive",
"license_source": "github-api"
} |
710663068 | how to use data from the query
Hi! I am very new to Grafana, JavaScript and SQL. I am now trying to develop an interface on ACE.SVG. I have no trouble with my query; however, I can't find a way to use it properly with this plugin. Could someone give me a quick example or explanation on how to get my data?
I have seen this demo, but this is very unclear to me. Especially "let buffer = data.series[0].fields[1].values.buffer;".
options.animateLogo = (svgmap, data) => {
  let buffer = data.series[0].fields[1].values.buffer;
  let valueCount = buffer.length
  let chartData = [];
  for (let i=0; i<valueCount; i+=(Math.floor(valueCount / 4)-1)) {
    chartData.push(buffer[i])
  }
  let minData = chartData.reduce((acc, val) => {
    return Math.min(acc, val);
Thank you for helping a beginner!
@SamuelJoly
The data variable is how you access the Grafana data frame API.
Grafana has great documentation for that API here: https://grafana.com/docs/grafana/latest/developers/plugins/data-frames/
All the code above is just sampling values from the time series, specifically 4 values, at even spacing, then getting the max and min from that set of 4 values.
Also remember you can use your browser's developer tools and console.log() to dig into any of this.
Here's pseudocode with literal values:
let buffer = [0,1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,16,17,18,19] // The first series in the data frame, the second column (value rather than timestamp), and the values in the form of a single buffer, or array
let valueCount = buffer.length // this is just capturing the length of the series for clarity
let chartData = [] // this is just making an empty list for us to put the 4 samples we want into
for 0 through 3 as i, get the value at the address i times 1/4 the length of the list, and add that value to the chartData array
let minData equal the smallest value in the chartData Array we built
Does any of that help?
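If it helps, the sampling idea above can be written as a small runnable sketch (Python rather than the panel's JavaScript, and with a cleaner even-spacing formula than the original loop):

```python
def sample_series(buffer, count=4):
    """Pick `count` values spread evenly from the first to the last element."""
    n = len(buffer)
    # index i of `count` maps to position i * (n-1) / (count-1), rounded down
    indices = [i * (n - 1) // (count - 1) for i in range(count)]
    return [buffer[i] for i in indices]

data = list(range(21))        # stand-in for data.series[0].fields[1].values.buffer
picked = sample_series(data)  # 4 roughly evenly spaced samples
print(picked, min(picked), max(picked))
```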
Thank you! This makes it very clear!
Might have other questions later ;)
| gharchive/issue | 2020-09-29T00:32:54 | 2025-04-01T06:36:39.530679 | {
"authors": [
"SamuelJoly",
"acedrew"
],
"repo": "ACE-IoT-Solutions/ace-svg-react",
"url": "https://github.com/ACE-IoT-Solutions/ace-svg-react/issues/6",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
260357747 | acme_diags did not produce output, waiting for user input on UVCDAT_ANONYMOUS_LOG ?
The acme_diags did not produce output in the latest run on 9/22 (/p/cscratch/acme/mccoy20/test_2017-09-14-Chris)
It seems that it may be waiting for user input, here is the output of acme_diag_set_1980_1984_8d193.err
[mccoy20@acme1 run_scripts]$ more acme_diag_set_1980_1984_8d193.err
Traceback (most recent call last):
...
File "/export/mccoy20/anaconda2/envs/workflow/lib/python2.7/site-packages/cdms2/init.py", line 6, in
cdat_info.pingPCMDIdb("cdat", "cdms2") # noqa
File "/export/mccoy20/anaconda2/envs/workflow/lib/python2.7/site-packages/cdat_info/cdat_info.py", line 205, in pingP
CMDIdb
askAnonymous(val)
File "/export/mccoy20/anaconda2/envs/workflow/lib/python2.7/site-packages/cdat_info/cdat_info.py", line 164, in askAn
onymous
"(you can also set the environment variable UVCDAT_ANONYMOUS_LOG to yes or no)? [yes]/no: ")
EOFError: EOF when reading a line
@mccoy20 @sterlingbaldwin I think Renata's correct, I've had this issue before. Just set the environmental variable beforehand. So either export UVCDAT_ANONYMOUS_LOG=False or export UVCDAT_ANONYMOUS_LOG=True.
Fixed in the new nightly.
| gharchive/issue | 2017-09-25T17:38:16 | 2025-04-01T06:36:39.538330 | {
"authors": [
"mccoy20",
"sterlingbaldwin",
"zshaheen"
],
"repo": "ACME-Climate/acme_processflow",
"url": "https://github.com/ACME-Climate/acme_processflow/issues/37",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
614005565 | Use the latest patch version in README
Goals :soccer:
Make README up to date
Implementation Details :construction:
Specifying 4.0 instead of the full patch version caused problems, e.g. as at https://stackoverflow.com/questions/61655457/cocoapods-could-not-find-compatible-versions-for-pod-afnetworking/61657341#61657341
Testing Details :mag:
Documentation update only
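For anyone hitting the linked resolution error, the usual CocoaPods idiom (illustrative, not part of this PR) already matches any patch release without pinning the exact version in the README:

```ruby
# Podfile: the pessimistic operator matches a range of versions.
pod 'AFNetworking', '~> 4.0'    # any 4.x release (>= 4.0, < 5.0)
# pod 'AFNetworking', '~> 4.0.1'  # only 4.0.x patch releases
```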
Thanks for the PR! However, these are correct: all of the package managers should properly update to the latest version with the current version strings, it's usually just a matter of "update" vs. "install" actions. I like to avoid having to update the README for every version bump. Thanks anyway!
| gharchive/pull-request | 2020-05-07T12:08:05 | 2025-04-01T06:36:39.612242 | {
"authors": [
"jshier",
"matthewmayer"
],
"repo": "AFNetworking/AFNetworking",
"url": "https://github.com/AFNetworking/AFNetworking/pull/4565",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
} |
427273155 | Update to Swift 5.0
The migration was easy. It only appears a couple of warnings.
This is great, just works, thank you!
I manipulated it and manually added it to my project, and it worked.
Thanks for your code.
@asam139, I have added your changes under the swift_5 branch.
Thanks for contributing to ARVideoKit!
| gharchive/pull-request | 2019-03-30T11:30:28 | 2025-04-01T06:36:39.613999 | {
"authors": [
"AFathi",
"asam139",
"hpayami",
"vade"
],
"repo": "AFathi/ARVideoKit",
"url": "https://github.com/AFathi/ARVideoKit/pull/79",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
2673576959 | redo pvar indexing logic & lazily load pvar file to save memory
This fixes #7 and also adds some lazy functionality from polars. I did throw this together rather quickly, so I suggest pulling it down, running the tests, and building any additional fixes on top of this.
I see the issue - should be fixed in the third commit
| gharchive/pull-request | 2024-11-19T21:26:20 | 2025-04-01T06:36:39.632292 | {
"authors": [
"kscott-1"
],
"repo": "AI-sandbox/snputils",
"url": "https://github.com/AI-sandbox/snputils/pull/8",
"license": "BSD-3-Clause",
"license_type": "permissive",
"license_source": "github-api"
} |
1924095378 | added avl tree implementation
PR Description
This code implements AVL trees, including insertion and deletion. It also adds different types of traversals.
Related Issue
Closes: #46
Changes Made
List the changes you made in this pull request:
Created the file for, and implemented, the AVL tree data structure and several operations on it.
Testing
Manual Testing
I tested the code with several examples of insertions, testing and covering all the edge cases. One of the test cases is given in the code as an example usage.
Author
Kaival Mehta (kaivalmehta)
@Tinny-Robot If the code is okay, can you merge the request? Or do I need to make any changes?
I have made the changes, kindly check it
| gharchive/pull-request | 2023-10-03T12:56:47 | 2025-04-01T06:36:39.642273 | {
"authors": [
"kaivalmehta"
],
"repo": "AIBauchi/PyDS-A",
"url": "https://github.com/AIBauchi/PyDS-A/pull/47",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1722560287 | Avoid Blank Posts and Add Default Photo #50
No Title No Post
No Post
No Title
@rt-001 where is default photo added?
Because of no respone from your side for more than 2 weeks, closed the pr.
@rt-001 where is default photo added?
@AKD-01 I did it in the last PR, but you mentioned that this reduces the user experience. I apologize for not replying earlier, but I completed my work.
Ohhk, if you have completed your work, you may raise a new pr.
| gharchive/pull-request | 2023-05-23T18:12:41 | 2025-04-01T06:36:39.666297 | {
"authors": [
"AKD-01",
"rt-001"
],
"repo": "AKD-01/blogweet",
"url": "https://github.com/AKD-01/blogweet/pull/176",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
724525299 | New CLI and ability to generate testcases from ontology only
In this branch I implemented the possibility to generate test case descriptions from an ontology. I switched from apache command line parser to picocli, bumped dependencies and (that might be problematic) did some code reformat (in an attempt to clean-up the code I used the Intellij Reformat Code feature).
Description
Almost every file has been touched in this branch, as I ran code reformat on the project, which changed line indentation (quite a lot of whitespace-only changes) and the order of imports. Let me know how we could handle this.
Following changes have been made:
introduced subcommands for validate and generate
bumped dependencies
create fat jar package (including all dependencies) using maven-assembly-plugin, new rdfunit-distribution project
prefix read from ontology vann:preferredNamespacePrefix if not given as parameter
Fixes (partly due to dependency updates):
Jena UUID generation
Motivation and Context
Generation of test case description needed for a customer.
How Has This Been Tested?
The command line args for validate work as before, a new subcommand has been introduced for the feature. shell scripts have been adapted accordingly, so there is no change.
Screenshots (if appropriate):
Types of changes
[ ] Bug fix (non-breaking change which fixes an issue)
[x] New feature (non-breaking change which adds functionality)
[ ] Breaking change (fix or feature that would cause existing functionality to change)
Checklist:
[?] My code follows the code style of this project.
[x] My change requires a change to the documentation.
[-] I have updated the documentation accordingly.
[-] I have added tests to cover my changes.
[-] All new and existing tests passed.
Thanks a lot for your contribution @mgns !
The whitespace changes makes it hard to review this change, I trust this is good and would be happy to merge as is but would make git history better to read if it would be easy to separate the whitespace changes with the new features.
one possible way could be the following, take the latest master and run the whole project with the same tool you used to create the whitespace changes and make a merge request with that alone. Once we merge that, rerun the same tool again on your branch and then rebase / merge on latest master. We can do a squash merge in the last step to make the current changes visible.
If this approach doesn't work we could merge it as is, wdyt?
Obsolete
Obsolete
| gharchive/pull-request | 2020-10-19T11:29:20 | 2025-04-01T06:36:39.681067 | {
"authors": [
"jimkont",
"mgns"
],
"repo": "AKSW/RDFUnit",
"url": "https://github.com/AKSW/RDFUnit/pull/104",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
2208130287 | Fabricskyblock not working on vulkan
I installed FabricSkyBoxes for amazing sky, and my pack has a custom sky, but it is still not working with Vulkan. What should I do?
causes
I found a fork of FabricSkyBoxes named FabricSkyBoxes Interop; it works with your mod, and everything works fine after installing this fork.
| gharchive/issue | 2024-03-26T12:36:35 | 2025-04-01T06:36:39.696031 | {
"authors": [
"TomXD1234"
],
"repo": "AMereBagatelle/fabricskyboxes",
"url": "https://github.com/AMereBagatelle/fabricskyboxes/issues/104",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
537265881 | Tuya-lan-find gives error when attempting to launch
I've run homebridge successfully for some months now, and recently installed the homebridge-tuya-lan module in order to control some Feit color lightbulbs that are not homekit compatible. I modified config.js of homebridge to include the tuya platform, but added no devices to it, since I don't yet have the id and key of the feit bulbs. Homebridge continues to launch successful except to complain of there being no configured devices for tuya. When I attempt to run tuya-lan-find, it give the following error:
/usr/local/lib/node_modules/homebridge-tuya-lan/bin/cli.js:175
let {address, port} = proxy.httpServer.address();
^
TypeError: Cannot read property 'address' of undefined
at proxy.listen (/usr/local/lib/node_modules/homebridge-tuya-lan/bin/cli.js:175:44)
at /usr/local/lib/node_modules/homebridge-tuya-lan/node_modules/http-mitm-proxy/lib/proxy.js:62:14
at /usr/local/lib/node_modules/homebridge-tuya-lan/node_modules/http-mitm-proxy/lib/ca.js:130:14
at /usr/local/lib/node_modules/homebridge-tuya-lan/node_modules/http-mitm-proxy/node_modules/async/dist/async.js:3888:9
at /usr/local/lib/node_modules/homebridge-tuya-lan/node_modules/http-mitm-proxy/node_modules/async/dist/async.js:473:16
at iterateeCallback (/usr/local/lib/node_modules/homebridge-tuya-lan/node_modules/http-mitm-proxy/node_modules/async/dist/async.js:988:17)
at /usr/local/lib/node_modules/homebridge-tuya-lan/node_modules/http-mitm-proxy/node_modules/async/dist/async.js:969:16
at /usr/local/lib/node_modules/homebridge-tuya-lan/node_modules/http-mitm-proxy/node_modules/async/dist/async.js:3885:13
at /usr/local/lib/node_modules/homebridge-tuya-lan/node_modules/mkdirp/index.js:47:53
at FSReqCallback.oncomplete (fs.js:161:21)
I'm running homebridge on Mac OS 10.15.1 on a 16" MacBook Pro.
I ran homebridge in debug mode and saw nothing pop up associated with tuya-lan-find in the console logs.
I found it necessary to launch tuya-lan-find as root in order for it to start the proxy server.
| gharchive/issue | 2019-12-12T23:20:59 | 2025-04-01T06:36:39.700792 | {
"authors": [
"dbrewer333"
],
"repo": "AMoo-Miki/homebridge-tuya-lan",
"url": "https://github.com/AMoo-Miki/homebridge-tuya-lan/issues/129",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
394869533 | Fix deprecation warning
Fixes
[DEPRECATION WARNING]: State 'installed' is deprecated. Using state 'present' instead.. This
feature will be removed in version 2.9. Deprecation warnings can be disabled by setting
deprecation_warnings=False in ansible.cfg.
Please release this to Galaxy
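For reference, the deprecated value this warning refers to typically appears in a package task like the following (an illustrative task, not necessarily the role's exact file):

```yaml
- name: Ensure the hostname package is present
  package:
    name: hostname
    state: present   # was "state: installed", deprecated and removed in Ansible 2.9
```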
| gharchive/pull-request | 2018-12-30T12:47:32 | 2025-04-01T06:36:39.714316 | {
"authors": [
"attenzione",
"volemont"
],
"repo": "ANXS/hostname",
"url": "https://github.com/ANXS/hostname/pull/18",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
} |
1570360326 | feat: add reference resolution option to allow root level dereferencing
This helps solve https://github.com/APIDevTools/json-schema-ref-parser/issues/199 by adding a new flag, externalReferenceResolution, that allows for reference resolution at the root level
@philsturgeon can you give me admin/bypass permissions on this repo?
You got it.
Pull Request Test Coverage Report for Build 6242355444
22 of 29 (75.86%) changed or added relevant lines in 6 files are covered.
1 unchanged line in 1 file lost coverage.
Overall coverage decreased (-0.2%) to 95.79%
Changes Missing Coverage:
lib/refs.ts: 2 of 3 covered (66.67%)
lib/index.ts: 1 of 7 covered (14.29%)
Files with Coverage Reduction:
lib/index.ts: 1 new missed line (96.76%)
Totals:
Change from base Build 6242072421: -0.2%
Covered Lines: 3212
Relevant Lines: 3311
💛 - Coveralls
| gharchive/pull-request | 2023-02-03T19:50:40 | 2025-04-01T06:36:39.800316 | {
"authors": [
"coveralls",
"jonluca",
"philsturgeon"
],
"repo": "APIDevTools/json-schema-ref-parser",
"url": "https://github.com/APIDevTools/json-schema-ref-parser/pull/305",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
582799019 | Can't not bundle in Travis CI
I try to bundle doc in travis but it's not success, Seems, it was a js error
Note, build folder existed
This is error
$ swagger-cli bundle -o build/swagger.bundle.yaml -t yaml swagger.yaml
Cannot read property 'mkdir' of undefined
This is .travis.yml
language: node_js
node_js:
  - 8
before_install:
  - npm install swagger-cli
  - export PATH=$(npm bin):$PATH
script:
  - swagger-cli validate swagger.yaml
after_success:
  - swagger-cli bundle -o build/swagger.bundle.yaml -t yaml swagger.yaml
deploy:
  provider: pages
  skip_cleanup: true
  github_token: $GH_TOKEN
  local_dir: build
  on:
    branch: master
Node 8 is no longer supported. Change the node_js setting in your Travis CI file to 10 and it'll work
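Concretely, that is a one-line change to the node_js section of the config shown above:

```yaml
language: node_js
node_js:
  - 10   # Node 8 reached end-of-life and is no longer supported by swagger-cli
```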
| gharchive/issue | 2020-03-17T07:02:40 | 2025-04-01T06:36:39.802465 | {
"authors": [
"JamesMessinger",
"nkthanh98"
],
"repo": "APIDevTools/swagger-cli",
"url": "https://github.com/APIDevTools/swagger-cli/issues/41",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
182688138 | the code in android made the app die
if(Platform.OS === 'android') {
try {
this.scrollToFocusedInputWithNodeHandle(currentlyFocusedField)
} catch (e) {
}
} else {
UIManager.viewIsDescendantOf(
currentlyFocusedField,
this.getScrollResponder().getInnerViewNode(),
(isAncestor) => {
if (isAncestor) {
// Check if the TextInput will be hidden by the keyboard
UIManager.measureInWindow(currentlyFocusedField, (x, y, width, height) => {
if (y + height > frames.endCoordinates.screenY) {
this.scrollToFocusedInputWithNodeHandle(currentlyFocusedField)
}
})
}
}
)
}
Please, in the meantime disable the automatic scrolling under Android:
enableAutoAutomaticScroll={(Platform.OS === 'ios') ? true : false}
Does that mean this component doesn't work on the Android platform?
Will a statement such as this.scrollToFocusedInputWithNodeHandle(currentlyFocusedField) make the app die?
There is no UIManager.viewIsDescendantOf in Android yet. The problem is that if you have multiple scroll views in the same scene, all will listen to the keyboard event but only one contains the TextInput children, so the ones without it will crash. The descendant method avoided the crash under iOS, but I haven't had time yet to send a PR for Android.
| gharchive/issue | 2016-10-13T03:41:18 | 2025-04-01T06:36:39.830163 | {
"authors": [
"ANWSY",
"alvaromb"
],
"repo": "APSL/react-native-keyboard-aware-scroll-view",
"url": "https://github.com/APSL/react-native-keyboard-aware-scroll-view/issues/68",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1429597950 | Translate the product class diagram into English
Creation of a unified diagram, correct relationships, translation into English.
Price, product, goods, cost price
| gharchive/pull-request | 2022-10-31T10:17:34 | 2025-04-01T06:36:39.832176 | {
"authors": [
"AParovyshnaya"
],
"repo": "AParovyshnaya/gena",
"url": "https://github.com/AParovyshnaya/gena/pull/5",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
2095303402 | Gather(ND) dim error
gather_test.tar.gz
Execute command:
./aarch64_build/tests/ExecuteNetwork -N -I 100 -c GpuAcc -m gather_test/gather_dim_test_float32.tflite --reuse-buffers --tflite-executor parser
Warning: No input files provided, input tensors will be filled with 0s.
Info: ArmNN v33.1.0
arm_release_ver of this libmali is 'g6p0-01eac0', rk_so_ver is '5'.
Info: Initialization time: 23.86 ms.
Error: Failed to parse operator #4 within subgraph #0 error: Operation has invalid output dimensions: 3 Output must be an (4 + 1 - 1) -D tensor at function ParseGather [/home/arm-user/source/armnn/src/armnnTfLiteParser/TfLiteParser.cpp:4786]
The Gather operator in the Compute Library used by Arm NN will build the output shape in a specific way:
https://github.com/ARM-software/ComputeLibrary/blob/c2a79a4b8c51ce835eaf984f3a1370447b3282c4/arm_compute/core/utils/misc/ShapeCalculator.h#L1684
The docs for arm_compute::misc::shape_calculator::compute_gather_shape() are:
/** Calculate the gather output shape of a tensor
*
* @param[in] input_shape Input tensor shape
* @param[in] indices_shape Indices tensor shape. Only supports for 2d and 3d indices
* @param[in] actual_axis Axis to be used in the computation
*
* @note Let input_shape be (X,Y,Z) and indices shape (W,O,P) and axis 1
* the new shape is computed by replacing the axis in the input shape with
* the indice shape so the output shape will be (X,W,O,P,Z)
*
* @return the calculated shape
*/
In the failing case provided, we have:
 * @note Let input_shape be [1,40,20,4] and indices shape [1] and axis 3
 * the new shape is computed by replacing the axis in the input shape with
 * the indices shape so the output shape will be [1,40,20,1]
This results in the reported error, where [1,40,20] != [1,40,20,1]: Arm NN conforms to the output-shape requirements of the library it uses, and fails because the output tensor shape set in the model does not match.
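The shape rule quoted above can be sketched in a few lines of Python (a minimal model of the Compute Library behaviour, not the actual ACL code, which operates on arm_compute::TensorShape):

```python
def compute_gather_shape(input_shape, indices_shape, axis):
    """Model of ACL's rule: the axis dimension of the input is
    replaced by the entire indices shape."""
    return input_shape[:axis] + indices_shape + input_shape[axis + 1:]

# Documented example: input (X,Y,Z), indices (W,O,P), axis 1 -> (X,W,O,P,Z)
print(compute_gather_shape(["X", "Y", "Z"], ["W", "O", "P"], 1))
# ['X', 'W', 'O', 'P', 'Z']

# Failing case from this issue: 1-D indices keep their dimension, so the
# computed output is the rank-4 shape [1, 40, 20, 1], while the model
# declared a rank-3 output [1, 40, 20] -- hence the parser error.
print(compute_gather_shape([1, 40, 20, 4], [1], 3))
# [1, 40, 20, 1]
```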
| gharchive/issue | 2024-01-23T06:04:30 | 2025-04-01T06:36:39.930049 | {
"authors": [
"tracyn-arm",
"zxros10"
],
"repo": "ARM-software/armnn",
"url": "https://github.com/ARM-software/armnn/issues/756",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1643695514 | Is there scenario document for SystemReady SR?
The docs/ directory has scenario documents for SystemReady ES and IR, but there is no such document for SystemReady SR (or LS).
I am trying to create a new version of my xBSA checklist page and to add information there about which (S)BSA ACS tests need to pass for each entry.
The scenarios cover several tags used by BSA ACS that are not mentioned in the BSA specification (DEN0094).
Hi @hrw,
It's a nice suggestion. Generally, as ES and SR are very close in terms of the BSA rules that need to run on ES or SR systems, the ES scenario document is valid for SR as well, but with the current document names it might seem that the SR guide is missing.
We will discuss internally on two approaches.
Have only two scenario documents (one for systems that use a device tree and one for systems using ACPI) + one .md file that reflects which rules are applicable for which band and at what levels (something closer to the table mentioned in your checklist page)
Have separate scenario documents for each band.
We will keep you updated on the same.
Thanks,
ACS team
Hi @hrw,
The ACS design makes the test layer agnostic to the PAL layer. The majority of the test algorithms are the same across systems with a device tree, an ACPI table, or even when running in a bare-metal environment.
Based on this design, a single test scenario document will be sufficient.
Further, a testcase checklist will be added, which will cover:
Which tests are required for which platforms (IR, ES, SR, baremetal)
Which tests are verified at the ACS end, and which are not because the required hardware is not available
We are planning to upstream the changes by this month end.
Thanks,
ACS team
Hi @hrw,
As part of the BSA ACS 1.0.5 release, we have made slight changes to the documentation as discussed. (https://github.com/ARM-software/bsa-acs/commit/b30c93dbf6c239d9df1505d8364a0bffbe58a2f6)
A single test scenario document covering the test algorithm for each test
A testcase checklist which indicates for which SystemReady band a test is required to run.
Thanks,
ACS team
Thanks @hrw for raising this; we are closing it as the changes are merged.
| gharchive/issue | 2023-03-28T10:57:10 | 2025-04-01T06:36:39.937284 | {
"authors": [
"chetan-rathore",
"hrw"
],
"repo": "ARM-software/bsa-acs",
"url": "https://github.com/ARM-software/bsa-acs/issues/135",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
} |
1176967784 | System hangs at test 861 : PCIe Unaligned access
In the verbose messages (the output of bsa -v 1), test 861 keeps printing the same message while checking bdf 000000.
[ 601.380221] 861 : PCIe Unaligned access START
[ 601.386824]
[ 601.386824] Calculated config address is 28c0600010
[ 601.386824] The BAR value of bdf 060000 is 3011
[ 601.386824] Calculated config address is 28c0000010
[ 601.386824] The BAR value of bdf 000000 is 0
[ 601.386824] Calculated config address is 28c0000010
[ 601.386824] The BAR value of bdf 000000 is 0
[ 601.386824] Calculated config address is 28c0000010
[ 601.386824] The BAR value of bdf 000000 is 0
[ 601.386824] Calculated config address is 28c0000010
It looks like the problem is that there is no break in the while loop in https://github.com/ARM-software/bsa-acs/blob/1cc33fea036e4a34dae7f75366e685226a647417/test_pool/pcie/operating_system/test_os_p061.c#L50
The same test (405) in sbsa doesn't have this issue because it has a condition check to break the loop in https://github.com/ARM-software/sbsa-acs/blob/5ccf09073bd4c17cab96bc338e2a3a314bf3a078/test_pool/pcie/test_p005.c#L84
Therefore, we may need to make the changes below to fix this issue.
Move the label "next_bdf:" to just above the line "while (count--) {"
Remove the "count--;" statements at the places that would run "goto next_bdf;"
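The essence of that fix can be modeled in a short Python sketch (illustrative names only; the real test is C code iterating over PCIe BDFs): every "skip to the next bdf" path has to route back through the loop's count check instead of jumping past it, otherwise an unusable bdf such as 000000 is retried forever.

```python
def scan_bdfs(bdfs, is_usable):
    """Fixed loop shape: the skip path re-enters through the count
    check (Python's continue), mirroring the 'next_bdf:' label being
    moved above 'while (count--)' with no extra 'count--' at the goto."""
    count = len(bdfs)
    checked = []
    i = 0
    while count:
        count -= 1               # the count-- that every path must pass through
        bdf = bdfs[i]
        i += 1
        if not is_usable(bdf):   # was: count--; goto next_bdf;
            continue             # fixed: just fall back to the count check
        checked.append(bdf)
    return checked

# bdf "000000" stands in for the unusable device from the log; the loop
# now skips it and still terminates.
print(scan_bdfs(["060000", "000000", "010203"], lambda b: b != "000000"))
# ['060000', '010203']
```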
This issue has been solved with the PR: https://github.com/ARM-software/bsa-acs/pull/27
Hi @sunnywang-arm,
The fix is merged with #27.
Thanks,
ACS team
| gharchive/issue | 2022-03-22T15:35:00 | 2025-04-01T06:36:39.941563 | {
"authors": [
"chetan-rathore",
"gowthamsiddarthd",
"sunnywang-arm"
],
"repo": "ARM-software/bsa-acs",
"url": "https://github.com/ARM-software/bsa-acs/issues/24",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
} |
169027083 | Juno test Permission denied
I copied the revent and apk files to /usr/local/lib/python2.7/dist-packages/wlauto/workloads/templerun/,
then executed wa, and found the following error:
2016-08-02 17:43:02,980 INFO Runner: Connecting to device
2016-08-02 17:43:02,984 DEBUG android: Discovering ANDROID_HOME from adb path.
2016-08-02 17:43:02,987 DEBUG android: ANDROID_HOME: /usr
2016-08-02 17:43:02,987 DEBUG android: Using aapt for version android-4.2.2
2016-08-02 17:43:02,987 DEBUG android: adb devices
2016-08-02 17:43:02,991 DEBUG android: adb disconnect 192.168.1.2:5555
2016-08-02 17:43:02,995 DEBUG android: adb connect 192.168.1.2:5555
2016-08-02 17:43:02,999 DEBUG android: connected to 192.168.1.2:5555
2016-08-02 17:43:02,999 DEBUG android:
2016-08-02 17:43:02,999 DEBUG android: adb -s 192.168.1.2:5555 shell " if [ -f /proc/cpuinfo ] ; then true ; else false ; fi"
2016-08-02 17:43:03,057 DEBUG Juno: Polling for device 192.168.1.2:5555...
2016-08-02 17:43:03,058 DEBUG android: adb devices
2016-08-02 17:43:03,064 DEBUG android: adb -s 192.168.1.2:5555 shell "getprop sys.boot_completed"
2016-08-02 17:43:03,143 INFO Runner: Initializing device
2016-08-02 17:43:03,146 DEBUG ExtensionLoader: Checking module wlauto.result_processors
2016-08-02 17:43:03,147 DEBUG ExtensionLoader: Checking module wlauto.result_processors.cpustate
2016-08-02 17:43:03,147 DEBUG ExtensionLoader: Adding result_processor cpustates
2016-08-02 17:43:03,147 DEBUG ExtensionLoader: Checking module wlauto.result_processors.dvfs
2016-08-02 17:43:03,148 DEBUG ExtensionLoader: Adding result_processor dvfs
2016-08-02 17:43:03,148 DEBUG ExtensionLoader: Checking module wlauto.result_processors.ipynb_exporter
2016-08-02 17:43:03,148 DEBUG ExtensionLoader: Adding result_processor ipynb_exporter
2016-08-02 17:43:03,148 DEBUG ExtensionLoader: Checking module wlauto.result_processors.mongodb
2016-08-02 17:43:03,148 DEBUG ExtensionLoader: Adding result_processor mongodb
2016-08-02 17:43:03,149 DEBUG ExtensionLoader: Checking module wlauto.result_processors.notify
2016-08-02 17:43:03,149 DEBUG ExtensionLoader: Adding result_processor notify
2016-08-02 17:43:03,149 DEBUG ExtensionLoader: Checking module wlauto.result_processors.sqlite
2016-08-02 17:43:03,149 DEBUG ExtensionLoader: Adding result_processor sqlite
2016-08-02 17:43:03,149 DEBUG ExtensionLoader: Checking module wlauto.result_processors.standard
2016-08-02 17:43:03,150 DEBUG ExtensionLoader: Adding result_processor csv
2016-08-02 17:43:03,150 DEBUG ExtensionLoader: Adding result_processor standard
2016-08-02 17:43:03,150 DEBUG ExtensionLoader: Adding result_processor summary_csv
2016-08-02 17:43:03,150 DEBUG ExtensionLoader: Adding result_processor json
2016-08-02 17:43:03,150 DEBUG ExtensionLoader: Checking module wlauto.result_processors.status
2016-08-02 17:43:03,151 DEBUG ExtensionLoader: Adding result_processor status
2016-08-02 17:43:03,151 DEBUG ExtensionLoader: Checking module wlauto.result_processors.syeg
2016-08-02 17:43:03,151 DEBUG ExtensionLoader: Adding result_processor syeg_csv
2016-08-02 17:43:03,151 DEBUG ExtensionLoader: Checking module wlauto.result_processors.uxperf
2016-08-02 17:43:03,151 DEBUG ExtensionLoader: Adding result_processor uxperf
2016-08-02 17:43:03,160 DEBUG ExtensionLoader: Checking module wlauto.workloads
2016-08-02 17:43:03,160 DEBUG ExtensionLoader: Checking module wlauto.workloads.andebench
2016-08-02 17:43:03,160 DEBUG ExtensionLoader: Adding workload andebench
2016-08-02 17:43:03,160 DEBUG ExtensionLoader: Checking module wlauto.workloads.androbench
2016-08-02 17:43:03,160 DEBUG ExtensionLoader: Adding workload androbench
2016-08-02 17:43:03,160 DEBUG ExtensionLoader: Checking module wlauto.workloads.angrybirds
2016-08-02 17:43:03,161 DEBUG ExtensionLoader: Adding workload angrybirds
2016-08-02 17:43:03,161 DEBUG ExtensionLoader: Checking module wlauto.workloads.angrybirds_rio
2016-08-02 17:43:03,161 DEBUG ExtensionLoader: Adding workload angrybirds_rio
2016-08-02 17:43:03,161 DEBUG ExtensionLoader: Checking module wlauto.workloads.anomaly2
2016-08-02 17:43:03,161 DEBUG ExtensionLoader: Adding workload anomaly2
2016-08-02 17:43:03,161 DEBUG ExtensionLoader: Checking module wlauto.workloads.antutu
2016-08-02 17:43:03,161 DEBUG ExtensionLoader: Adding workload antutu
2016-08-02 17:43:03,161 DEBUG ExtensionLoader: Checking module wlauto.workloads.apklaunch
2016-08-02 17:43:03,161 DEBUG ExtensionLoader: Adding workload apklaunch
2016-08-02 17:43:03,161 DEBUG ExtensionLoader: Checking module wlauto.workloads.applaunch
2016-08-02 17:43:03,161 DEBUG ExtensionLoader: Adding workload applaunch
2016-08-02 17:43:03,161 DEBUG ExtensionLoader: Checking module wlauto.workloads.audio
2016-08-02 17:43:03,161 DEBUG ExtensionLoader: Adding workload audio
2016-08-02 17:43:03,161 DEBUG ExtensionLoader: Checking module wlauto.workloads.autotest
2016-08-02 17:43:03,161 DEBUG ExtensionLoader: Adding workload autotest
2016-08-02 17:43:03,161 DEBUG ExtensionLoader: Checking module wlauto.workloads.bbench
2016-08-02 17:43:03,161 DEBUG ExtensionLoader: Adding workload bbench
2016-08-02 17:43:03,161 DEBUG ExtensionLoader: Checking module wlauto.workloads.benchmarkpi
2016-08-02 17:43:03,161 DEBUG ExtensionLoader: Adding workload benchmarkpi
2016-08-02 17:43:03,161 DEBUG ExtensionLoader: Checking module wlauto.workloads.blogbench
2016-08-02 17:43:03,161 DEBUG ExtensionLoader: Adding workload blogbench
2016-08-02 17:43:03,161 DEBUG ExtensionLoader: Checking module wlauto.workloads.caffeinemark
2016-08-02 17:43:03,161 DEBUG ExtensionLoader: Adding workload caffeinemark
2016-08-02 17:43:03,162 DEBUG ExtensionLoader: Checking module wlauto.workloads.cameracapture
2016-08-02 17:43:03,162 DEBUG ExtensionLoader: Adding workload cameracapture
2016-08-02 17:43:03,162 DEBUG ExtensionLoader: Checking module wlauto.workloads.camerarecord
2016-08-02 17:43:03,162 DEBUG ExtensionLoader: Adding workload camerarecord
2016-08-02 17:43:03,162 DEBUG ExtensionLoader: Checking module wlauto.workloads.castlebuilder
2016-08-02 17:43:03,162 DEBUG ExtensionLoader: Adding workload castlebuilder
2016-08-02 17:43:03,162 DEBUG ExtensionLoader: Checking module wlauto.workloads.castlemaster
2016-08-02 17:43:03,162 DEBUG ExtensionLoader: Adding workload castlemaster
2016-08-02 17:43:03,162 DEBUG ExtensionLoader: Checking module wlauto.workloads.cfbench
2016-08-02 17:43:03,162 DEBUG ExtensionLoader: Adding workload cfbench
2016-08-02 17:43:03,162 DEBUG ExtensionLoader: Checking module wlauto.workloads.citadel
2016-08-02 17:43:03,162 DEBUG ExtensionLoader: Adding workload citadel
2016-08-02 17:43:03,162 DEBUG ExtensionLoader: Checking module wlauto.workloads.cyclictest
2016-08-02 17:43:03,162 DEBUG ExtensionLoader: Adding workload cyclictest
2016-08-02 17:43:03,162 DEBUG ExtensionLoader: Checking module wlauto.workloads.dex2oat
2016-08-02 17:43:03,162 DEBUG ExtensionLoader: Adding workload dex2oat
2016-08-02 17:43:03,162 DEBUG ExtensionLoader: Checking module wlauto.workloads.dhrystone
2016-08-02 17:43:03,162 DEBUG ExtensionLoader: Adding workload dhrystone
2016-08-02 17:43:03,162 DEBUG ExtensionLoader: Checking module wlauto.workloads.dungeondefenders
2016-08-02 17:43:03,162 DEBUG ExtensionLoader: Adding workload dungeondefenders
2016-08-02 17:43:03,162 DEBUG ExtensionLoader: Checking module wlauto.workloads.ebizzy
2016-08-02 17:43:03,162 DEBUG ExtensionLoader: Adding workload ebizzy
2016-08-02 17:43:03,162 DEBUG ExtensionLoader: Checking module wlauto.workloads.facebook
2016-08-02 17:43:03,162 DEBUG ExtensionLoader: Adding workload facebook
2016-08-02 17:43:03,162 DEBUG ExtensionLoader: Checking module wlauto.workloads.geekbench
2016-08-02 17:43:03,162 DEBUG ExtensionLoader: Adding workload geekbench
2016-08-02 17:43:03,163 DEBUG ExtensionLoader: Checking module wlauto.workloads.glbcorp
2016-08-02 17:43:03,163 DEBUG ExtensionLoader: Adding workload glb_corporate
2016-08-02 17:43:03,163 DEBUG ExtensionLoader: Checking module wlauto.workloads.glbenchmark
2016-08-02 17:43:03,163 DEBUG ExtensionLoader: Adding workload glbenchmark
2016-08-02 17:43:03,163 DEBUG ExtensionLoader: Checking module wlauto.workloads.googlemap
2016-08-02 17:43:03,163 DEBUG ExtensionLoader: Adding workload googlemap
2016-08-02 17:43:03,163 DEBUG ExtensionLoader: Checking module wlauto.workloads.gunbros2
2016-08-02 17:43:03,163 DEBUG ExtensionLoader: Adding workload gunbros2
2016-08-02 17:43:03,163 DEBUG ExtensionLoader: Checking module wlauto.workloads.hackbench
2016-08-02 17:43:03,163 DEBUG ExtensionLoader: Adding workload hackbench
2016-08-02 17:43:03,163 DEBUG ExtensionLoader: Checking module wlauto.workloads.homescreen
2016-08-02 17:43:03,163 DEBUG ExtensionLoader: Adding workload homescreen
2016-08-02 17:43:03,163 DEBUG ExtensionLoader: Checking module wlauto.workloads.hwuitest
2016-08-02 17:43:03,163 DEBUG ExtensionLoader: Adding workload hwuitest
2016-08-02 17:43:03,163 DEBUG ExtensionLoader: Checking module wlauto.workloads.idle
2016-08-02 17:43:03,163 DEBUG ExtensionLoader: Adding workload idle
2016-08-02 17:43:03,163 DEBUG ExtensionLoader: Checking module wlauto.workloads.iozone
2016-08-02 17:43:03,163 DEBUG ExtensionLoader: Adding workload iozone
2016-08-02 17:43:03,163 DEBUG ExtensionLoader: Checking module wlauto.workloads.ironman
2016-08-02 17:43:03,163 DEBUG ExtensionLoader: Adding workload ironman3
2016-08-02 17:43:03,163 DEBUG ExtensionLoader: Checking module wlauto.workloads.krazykart
2016-08-02 17:43:03,163 DEBUG ExtensionLoader: Adding workload krazykart
2016-08-02 17:43:03,163 DEBUG ExtensionLoader: Checking module wlauto.workloads.linpack
2016-08-02 17:43:03,164 DEBUG ExtensionLoader: Adding workload linpack
2016-08-02 17:43:03,164 DEBUG ExtensionLoader: Checking module wlauto.workloads.linpack_cli
2016-08-02 17:43:03,164 DEBUG ExtensionLoader: Adding workload linpack-cli
2016-08-02 17:43:03,164 DEBUG ExtensionLoader: Checking module wlauto.workloads.lmbench
2016-08-02 17:43:03,164 DEBUG ExtensionLoader: Adding workload lmbench
2016-08-02 17:43:03,164 DEBUG ExtensionLoader: Checking module wlauto.workloads.manual
2016-08-02 17:43:03,164 DEBUG ExtensionLoader: Adding workload manual
2016-08-02 17:43:03,164 DEBUG ExtensionLoader: Checking module wlauto.workloads.memcpy
2016-08-02 17:43:03,164 DEBUG ExtensionLoader: Adding workload memcpy
2016-08-02 17:43:03,164 DEBUG ExtensionLoader: Checking module wlauto.workloads.nenamark
2016-08-02 17:43:03,164 DEBUG ExtensionLoader: Adding workload nenamark
2016-08-02 17:43:03,164 DEBUG ExtensionLoader: Checking module wlauto.workloads.peacekeeper
2016-08-02 17:43:03,164 DEBUG ExtensionLoader: Adding workload peacekeeper
2016-08-02 17:43:03,164 DEBUG ExtensionLoader: Checking module wlauto.workloads.power_loadtest
2016-08-02 17:43:03,164 DEBUG ExtensionLoader: Adding workload power_loadtest
2016-08-02 17:43:03,164 DEBUG ExtensionLoader: Checking module wlauto.workloads.quadrant
2016-08-02 17:43:03,164 DEBUG ExtensionLoader: Adding workload quadrant
2016-08-02 17:43:03,164 DEBUG ExtensionLoader: Checking module wlauto.workloads.real_linpack
2016-08-02 17:43:03,164 DEBUG ExtensionLoader: Adding workload real-linpack
2016-08-02 17:43:03,164 DEBUG ExtensionLoader: Checking module wlauto.workloads.realracing3
2016-08-02 17:43:03,164 DEBUG ExtensionLoader: Adding workload realracing3
2016-08-02 17:43:03,164 DEBUG ExtensionLoader: Checking module wlauto.workloads.recentfling
2016-08-02 17:43:03,164 DEBUG ExtensionLoader: Adding workload recentfling
2016-08-02 17:43:03,164 DEBUG ExtensionLoader: Checking module wlauto.workloads.rt_app
2016-08-02 17:43:03,164 DEBUG ExtensionLoader: Adding workload rt-app
2016-08-02 17:43:03,165 DEBUG ExtensionLoader: Checking module wlauto.workloads.shellscript
2016-08-02 17:43:03,165 DEBUG ExtensionLoader: Adding workload shellscript
2016-08-02 17:43:03,165 DEBUG ExtensionLoader: Checking module wlauto.workloads.skypevideo
2016-08-02 17:43:03,165 DEBUG ExtensionLoader: Adding workload skypevideo
2016-08-02 17:43:03,165 DEBUG ExtensionLoader: Checking module wlauto.workloads.smartbench
2016-08-02 17:43:03,165 DEBUG ExtensionLoader: Adding workload smartbench
2016-08-02 17:43:03,165 DEBUG ExtensionLoader: Checking module wlauto.workloads.spec2000
2016-08-02 17:43:03,165 DEBUG ExtensionLoader: Adding workload spec2000
2016-08-02 17:43:03,165 DEBUG ExtensionLoader: Checking module wlauto.workloads.sqlite
2016-08-02 17:43:03,165 DEBUG ExtensionLoader: Adding workload sqlitebm
2016-08-02 17:43:03,165 DEBUG ExtensionLoader: Checking module wlauto.workloads.stream
2016-08-02 17:43:03,165 DEBUG ExtensionLoader: Adding workload stream
2016-08-02 17:43:03,165 DEBUG ExtensionLoader: Checking module wlauto.workloads.stress_ng
2016-08-02 17:43:03,165 DEBUG ExtensionLoader: Adding workload stress_ng
2016-08-02 17:43:03,165 DEBUG ExtensionLoader: Checking module wlauto.workloads.sysbench
2016-08-02 17:43:03,165 DEBUG ExtensionLoader: Adding workload sysbench
2016-08-02 17:43:03,165 DEBUG ExtensionLoader: Checking module wlauto.workloads.telemetry
2016-08-02 17:43:03,165 DEBUG ExtensionLoader: Adding workload telemetry
2016-08-02 17:43:03,165 DEBUG ExtensionLoader: Checking module wlauto.workloads.templerun
2016-08-02 17:43:03,165 DEBUG ExtensionLoader: Adding workload templerun
2016-08-02 17:43:03,165 DEBUG ExtensionLoader: Checking module wlauto.workloads.thechase
2016-08-02 17:43:03,165 DEBUG ExtensionLoader: Adding workload thechase
2016-08-02 17:43:03,165 DEBUG ExtensionLoader: Checking module wlauto.workloads.truckerparking3d
2016-08-02 17:43:03,165 DEBUG ExtensionLoader: Adding workload truckerparking3d
2016-08-02 17:43:03,166 DEBUG ExtensionLoader: Checking module wlauto.workloads.vellamo
2016-08-02 17:43:03,166 DEBUG ExtensionLoader: Adding workload vellamo
2016-08-02 17:43:03,166 DEBUG ExtensionLoader: Checking module wlauto.workloads.video
2016-08-02 17:43:03,166 DEBUG ExtensionLoader: Adding workload video
2016-08-02 17:43:03,166 DEBUG ExtensionLoader: Checking module wlauto.workloads.videostreaming
2016-08-02 17:43:03,166 DEBUG ExtensionLoader: Adding workload videostreaming
2016-08-02 17:43:03,166 DEBUG ExtensionLoader: Checking module wlauto.modules
2016-08-02 17:43:03,166 DEBUG ExtensionLoader: Checking module wlauto.modules.active_cooling
2016-08-02 17:43:03,166 DEBUG ExtensionLoader: Adding module mbed-fan
2016-08-02 17:43:03,166 DEBUG ExtensionLoader: Adding module odroidxu3-fan
2016-08-02 17:43:03,166 DEBUG ExtensionLoader: Checking module wlauto.modules.cgroups
2016-08-02 17:43:03,166 DEBUG ExtensionLoader: Adding module cgroups
2016-08-02 17:43:03,166 DEBUG ExtensionLoader: Checking module wlauto.modules.cpufreq
2016-08-02 17:43:03,166 DEBUG ExtensionLoader: Adding module devcpufreq
2016-08-02 17:43:03,166 DEBUG ExtensionLoader: Checking module wlauto.modules.cpuidle
2016-08-02 17:43:03,166 DEBUG ExtensionLoader: Adding module cpuidle
2016-08-02 17:43:03,166 DEBUG ExtensionLoader: Checking module wlauto.modules.flashing
2016-08-02 17:43:03,166 DEBUG ExtensionLoader: Adding module fastboot
2016-08-02 17:43:03,166 DEBUG ExtensionLoader: Adding module vexpress
2016-08-02 17:43:03,166 DEBUG ExtensionLoader: Checking module wlauto.modules.reset
2016-08-02 17:43:03,167 DEBUG ExtensionLoader: Adding module netio_switch
2016-08-02 17:43:03,168 DEBUG ExtensionLoader: Checking module wlauto.instrumentation
2016-08-02 17:43:03,168 DEBUG ExtensionLoader: Checking module wlauto.instrumentation.coreutil
2016-08-02 17:43:03,168 DEBUG ExtensionLoader: Adding instrument coreutil
2016-08-02 17:43:03,168 DEBUG ExtensionLoader: Checking module wlauto.instrumentation.daq
2016-08-02 17:43:03,168 DEBUG ExtensionLoader: Adding instrument daq
2016-08-02 17:43:03,168 DEBUG ExtensionLoader: Checking module wlauto.instrumentation.delay
2016-08-02 17:43:03,168 DEBUG ExtensionLoader: Adding instrument delay
2016-08-02 17:43:03,168 DEBUG ExtensionLoader: Checking module wlauto.instrumentation.dmesg
2016-08-02 17:43:03,168 DEBUG ExtensionLoader: Adding instrument dmesg
2016-08-02 17:43:03,168 DEBUG ExtensionLoader: Checking module wlauto.instrumentation.energy_model
2016-08-02 17:43:03,168 DEBUG ExtensionLoader: Adding instrument energy_model
2016-08-02 17:43:03,169 DEBUG ExtensionLoader: Checking module wlauto.instrumentation.energy_probe
2016-08-02 17:43:03,169 DEBUG ExtensionLoader: Adding instrument energy_probe
2016-08-02 17:43:03,169 DEBUG ExtensionLoader: Checking module wlauto.instrumentation.fps
2016-08-02 17:43:03,169 DEBUG ExtensionLoader: Adding instrument fps
2016-08-02 17:43:03,169 DEBUG ExtensionLoader: Checking module wlauto.instrumentation.freqsweep
2016-08-02 17:43:03,169 DEBUG ExtensionLoader: Adding instrument freq_sweep
2016-08-02 17:43:03,169 DEBUG ExtensionLoader: Checking module wlauto.instrumentation.hwmon
2016-08-02 17:43:03,169 DEBUG ExtensionLoader: Adding instrument hwmon
2016-08-02 17:43:03,169 DEBUG ExtensionLoader: Checking module wlauto.instrumentation.juno_energy
2016-08-02 17:43:03,169 DEBUG ExtensionLoader: Adding instrument juno_energy
2016-08-02 17:43:03,169 DEBUG ExtensionLoader: Checking module wlauto.instrumentation.misc
2016-08-02 17:43:03,169 DEBUG ExtensionLoader: Adding instrument interrupts
2016-08-02 17:43:03,169 DEBUG ExtensionLoader: Adding instrument sysfs_extractor
2016-08-02 17:43:03,169 DEBUG ExtensionLoader: Adding instrument execution_time
2016-08-02 17:43:03,169 DEBUG ExtensionLoader: Adding instrument cpufreq
2016-08-02 17:43:03,169 DEBUG ExtensionLoader: Checking module wlauto.instrumentation.netstats
2016-08-02 17:43:03,169 DEBUG ExtensionLoader: Adding instrument netstats
2016-08-02 17:43:03,169 DEBUG ExtensionLoader: Checking module wlauto.instrumentation.perf
2016-08-02 17:43:03,169 DEBUG ExtensionLoader: Adding instrument perf
2016-08-02 17:43:03,169 DEBUG ExtensionLoader: Checking module wlauto.instrumentation.pmu_logger
2016-08-02 17:43:03,169 DEBUG ExtensionLoader: Adding instrument cci_pmu_logger
2016-08-02 17:43:03,170 DEBUG ExtensionLoader: Checking module wlauto.instrumentation.poller
2016-08-02 17:43:03,170 DEBUG ExtensionLoader: Adding instrument file_poller
2016-08-02 17:43:03,170 DEBUG ExtensionLoader: Checking module wlauto.instrumentation.screenon
2016-08-02 17:43:03,170 DEBUG ExtensionLoader: Adding instrument screenon
2016-08-02 17:43:03,170 DEBUG ExtensionLoader: Checking module wlauto.instrumentation.servo_power_monitors
2016-08-02 17:43:03,170 DEBUG ExtensionLoader: Adding instrument servo_power
2016-08-02 17:43:03,170 DEBUG ExtensionLoader: Checking module wlauto.instrumentation.streamline
2016-08-02 17:43:03,170 DEBUG ExtensionLoader: Adding instrument streamline
2016-08-02 17:43:03,170 DEBUG ExtensionLoader: Adding resource_getter streamline_resource
2016-08-02 17:43:03,170 DEBUG ExtensionLoader: Checking module wlauto.instrumentation.systrace
2016-08-02 17:43:03,170 DEBUG ExtensionLoader: Adding instrument systrace
2016-08-02 17:43:03,170 DEBUG ExtensionLoader: Checking module wlauto.instrumentation.trace_cmd
2016-08-02 17:43:03,170 DEBUG ExtensionLoader: Adding instrument trace-cmd
2016-08-02 17:43:03,170 DEBUG ExtensionLoader: Checking module wlauto.commands
2016-08-02 17:43:03,170 DEBUG ExtensionLoader: Checking module wlauto.commands.create
2016-08-02 17:43:03,170 DEBUG ExtensionLoader: Adding command create
2016-08-02 17:43:03,170 DEBUG ExtensionLoader: Checking module wlauto.commands.list
2016-08-02 17:43:03,170 DEBUG ExtensionLoader: Adding command list
2016-08-02 17:43:03,171 DEBUG ExtensionLoader: Checking module wlauto.commands.record
2016-08-02 17:43:03,171 DEBUG ExtensionLoader: Adding command record
2016-08-02 17:43:03,171 DEBUG ExtensionLoader: Adding command replay
2016-08-02 17:43:03,171 DEBUG ExtensionLoader: Checking module wlauto.commands.run
2016-08-02 17:43:03,171 DEBUG ExtensionLoader: Adding command run
2016-08-02 17:43:03,171 DEBUG ExtensionLoader: Checking module wlauto.commands.show
2016-08-02 17:43:03,171 DEBUG ExtensionLoader: Adding command show
2016-08-02 17:43:03,172 DEBUG ExtensionLoader: Checking module wlauto.devices
2016-08-02 17:43:03,172 DEBUG ExtensionLoader: Checking module wlauto.devices.android
2016-08-02 17:43:03,172 DEBUG ExtensionLoader: Checking module wlauto.devices.android.gem5
2016-08-02 17:43:03,172 DEBUG ExtensionLoader: Adding device gem5_android
2016-08-02 17:43:03,172 DEBUG ExtensionLoader: Checking module wlauto.devices.android.generic
2016-08-02 17:43:03,172 DEBUG ExtensionLoader: Adding device generic_android
2016-08-02 17:43:03,172 DEBUG ExtensionLoader: Checking module wlauto.devices.android.juno
2016-08-02 17:43:03,172 DEBUG ExtensionLoader: Adding device juno
2016-08-02 17:43:03,172 DEBUG ExtensionLoader: Checking module wlauto.devices.android.nexus10
2016-08-02 17:43:03,172 DEBUG ExtensionLoader: Adding device Nexus10
2016-08-02 17:43:03,172 DEBUG ExtensionLoader: Checking module wlauto.devices.android.nexus5
2016-08-02 17:43:03,173 DEBUG ExtensionLoader: Adding device Nexus5
2016-08-02 17:43:03,173 DEBUG ExtensionLoader: Checking module wlauto.devices.android.note3
2016-08-02 17:43:03,173 DEBUG ExtensionLoader: Adding device Note3
2016-08-02 17:43:03,173 DEBUG ExtensionLoader: Checking module wlauto.devices.android.odroidxu3
2016-08-02 17:43:03,173 DEBUG ExtensionLoader: Adding device odroidxu3
2016-08-02 17:43:03,173 DEBUG ExtensionLoader: Checking module wlauto.devices.android.tc2
2016-08-02 17:43:03,173 DEBUG ExtensionLoader: Adding device TC2
2016-08-02 17:43:03,173 DEBUG ExtensionLoader: Checking module wlauto.devices.linux
2016-08-02 17:43:03,173 DEBUG ExtensionLoader: Checking module wlauto.devices.linux.XE503C12
2016-08-02 17:43:03,173 DEBUG ExtensionLoader: Adding device XE503C12
2016-08-02 17:43:03,173 DEBUG ExtensionLoader: Checking module wlauto.devices.linux.chromeos_test_image
2016-08-02 17:43:03,173 DEBUG ExtensionLoader: Adding device chromeos_test_image
2016-08-02 17:43:03,173 DEBUG ExtensionLoader: Checking module wlauto.devices.linux.gem5
2016-08-02 17:43:03,173 DEBUG ExtensionLoader: Adding device gem5_linux
2016-08-02 17:43:03,173 DEBUG ExtensionLoader: Checking module wlauto.devices.linux.generic
2016-08-02 17:43:03,173 DEBUG ExtensionLoader: Adding device generic_linux
2016-08-02 17:43:03,173 DEBUG ExtensionLoader: Checking module wlauto.devices.linux.odroidxu3_linux
2016-08-02 17:43:03,173 DEBUG ExtensionLoader: Adding device odroidxu3_linux
2016-08-02 17:43:03,174 DEBUG ExtensionLoader: Checking module wlauto.resource_getters
2016-08-02 17:43:03,174 DEBUG ExtensionLoader: Checking module wlauto.resource_getters.standard
2016-08-02 17:43:03,174 DEBUG ExtensionLoader: Adding resource_getter environment_jar
2016-08-02 17:43:03,174 DEBUG ExtensionLoader: Adding resource_getter environment_dependency
2016-08-02 17:43:03,174 DEBUG ExtensionLoader: Adding resource_getter extension_asset
2016-08-02 17:43:03,174 DEBUG ExtensionLoader: Adding resource_getter http_assets
2016-08-02 17:43:03,174 DEBUG ExtensionLoader: Adding resource_getter env_exe_getter
2016-08-02 17:43:03,174 DEBUG ExtensionLoader: Adding resource_getter environment_common_dependency
2016-08-02 17:43:03,174 DEBUG ExtensionLoader: Adding resource_getter filer
2016-08-02 17:43:03,174 DEBUG ExtensionLoader: Adding resource_getter package_exe_getter
2016-08-02 17:43:03,174 DEBUG ExtensionLoader: Adding resource_getter filer_assets
2016-08-02 17:43:03,174 DEBUG ExtensionLoader: Adding resource_getter enviroment_revent
2016-08-02 17:43:03,174 DEBUG ExtensionLoader: Adding resource_getter package_file
2016-08-02 17:43:03,174 DEBUG ExtensionLoader: Adding resource_getter package_apk
2016-08-02 17:43:03,174 DEBUG ExtensionLoader: Adding resource_getter packaged_dependency
2016-08-02 17:43:03,174 DEBUG ExtensionLoader: Adding resource_getter packaged_common_dependency
2016-08-02 17:43:03,174 DEBUG ExtensionLoader: Adding resource_getter environment_file
2016-08-02 17:43:03,174 DEBUG ExtensionLoader: Adding resource_getter exe_getter
2016-08-02 17:43:03,174 DEBUG ExtensionLoader: Adding resource_getter package_revent
2016-08-02 17:43:03,174 DEBUG ExtensionLoader: Adding resource_getter environment_apk
2016-08-02 17:43:03,174 DEBUG ExtensionLoader: Adding resource_getter package_jar
2016-08-02 17:43:03,175 DEBUG ExtensionLoader: Loading from paths.
2016-08-02 17:43:03,175 DEBUG android: adb -s 192.168.1.2:5555 shell "su"
2016-08-02 17:43:04,179 DEBUG check_output: 29480 timed out; sending SIGKILL
2016-08-02 17:43:04,180 DEBUG android: adb -s 192.168.1.2:5555 shell "echo 'cat '\''/sys/devices/system/cpu/online'\''' | su"
2016-08-02 17:43:04,270 DEBUG android: adb -s 192.168.1.2:5555 shell "if [ -e '/sys/devices/system/cpu/cpu0/cpufreq' ]; then echo 1; else echo 0; fi"
2016-08-02 17:43:04,336 DEBUG Juno: Installing module "devcpufreq"
2016-08-02 17:43:04,339 DEBUG android: adb -s 192.168.1.2:5555 shell "if [ -e '/sys/devices/system/cpu/cpuidle' ]; then echo 1; else echo 0; fi"
2016-08-02 17:43:04,404 DEBUG Juno: Installing module "cpuidle"
2016-08-02 17:43:04,405 DEBUG android: adb -s 192.168.1.2:5555 shell "mkdir -p /sdcard/wa-working"
2016-08-02 17:43:04,483 DEBUG android: adb -s 192.168.1.2:5555 shell "mkdir -p /data/local/tmp/wa-bin"
2016-08-02 17:43:04,559 DEBUG android: adb -s 192.168.1.2:5555 shell "if [ -e '/data/local/tmp/wa-bin/busybox' ]; then echo 1; else echo 0; fi"
2016-08-02 17:43:04,622 DEBUG android: adb -s 192.168.1.2:5555 shell "getprop"
2016-08-02 17:43:04,709 DEBUG ResourceResolver: Resolving <no-one's arm64 sqlite3>
2016-08-02 17:43:04,710 DEBUG ResourceResolver: Trying
2016-08-02 17:43:04,710 DEBUG ResourceResolver: Trying
2016-08-02 17:43:04,710 DEBUG ResourceResolver: Trying
2016-08-02 17:43:04,711 DEBUG ResourceResolver: Resource <no-one's arm64 sqlite3> found using :
2016-08-02 17:43:04,711 DEBUG ResourceResolver: /usr/local/lib/python2.7/dist-packages/wlauto/common/bin/arm64/sqlite3
2016-08-02 17:43:04,711 DEBUG android: adb -s 192.168.1.2:5555 shell "if [ -e '/data/local/tmp/wa-bin/sqlite3' ]; then echo 1; else echo 0; fi"
2016-08-02 17:43:04,773 DEBUG android: adb -s 192.168.1.2:5555 shell "/data/local/tmp/wa-bin/busybox which sqlite3"
2016-08-02 17:43:04,844 DEBUG android: adb -s 192.168.1.2:5555 shell "mount"
2016-08-02 17:43:04,931 DEBUG android: adb -s 192.168.1.2:5555 push '/tmp/tmpcA3VQj' '/sdcard/wa-working/disable_screen_lock'
2016-08-02 17:43:05,058 DEBUG android: adb -s 192.168.1.2:5555 shell "echo 'cp /sdcard/wa-working/disable_screen_lock /data/local/tmp/wa-bin/disable_screen_lock' | su"
2016-08-02 17:43:05,169 DEBUG android: adb -s 192.168.1.2:5555 shell "echo 'chmod 0777 /data/local/tmp/wa-bin/disable_screen_lock' | su"
2016-08-02 17:43:05,265 DEBUG android: adb -s 192.168.1.2:5555 shell "echo '/data/local/tmp/wa-bin/disable_screen_lock' | su"
2016-08-02 17:43:05,388 DEBUG android: adb -s 192.168.1.2:5555 shell "getprop ro.build.version.sdk"
2016-08-02 17:43:05,453 DEBUG android: adb -s 192.168.1.2:5555 shell "echo 'getenforce' | su"
2016-08-02 17:43:05,560 INFO Runner: Initializing workloads
2016-08-02 17:43:05,561 DEBUG ResourceResolver: Resolving <'s revent>
2016-08-02 17:43:05,561 DEBUG ResourceResolver: Trying
2016-08-02 17:43:05,562 DEBUG RemoteFilerGetter: resource=<'s revent>, version=None, remote_path=templerun, local_path=/root/.workload_automation/dependencies/templerun
2016-08-02 17:43:05,562 DEBUG android: adb -s 192.168.1.2:5555 shell "getprop"
2016-08-02 17:43:05,647 DEBUG android: adb -s 192.168.1.2:5555 shell "getprop"
2016-08-02 17:43:05,722 DEBUG ResourceResolver: Trying
2016-08-02 17:43:05,723 DEBUG ResourceResolver: Trying
2016-08-02 17:43:05,723 DEBUG android: adb -s 192.168.1.2:5555 shell "getprop"
2016-08-02 17:43:05,802 DEBUG ResourceResolver: Trying
2016-08-02 17:43:05,802 DEBUG android: adb -s 192.168.1.2:5555 shell "getprop"
2016-08-02 17:43:05,878 DEBUG ResourceResolver: Resource <'s revent> found using :
2016-08-02 17:43:05,878 DEBUG ResourceResolver: /usr/local/lib/python2.7/dist-packages/wlauto/workloads/templerun/revent_files/juno.setup.revent
2016-08-02 17:43:05,879 DEBUG ResourceResolver: Resolving <'s revent>
2016-08-02 17:43:05,879 DEBUG ResourceResolver: Trying
2016-08-02 17:43:05,879 DEBUG RemoteFilerGetter: resource=<'s revent>, version=None, remote_path=templerun, local_path=/root/.workload_automation/dependencies/templerun
2016-08-02 17:43:05,879 DEBUG android: adb -s 192.168.1.2:5555 shell "getprop"
2016-08-02 17:43:05,959 DEBUG android: adb -s 192.168.1.2:5555 shell "getprop"
2016-08-02 17:43:06,048 DEBUG ResourceResolver: Trying
2016-08-02 17:43:06,048 DEBUG ResourceResolver: Trying
2016-08-02 17:43:06,048 DEBUG android: adb -s 192.168.1.2:5555 shell "getprop"
2016-08-02 17:43:06,135 DEBUG ResourceResolver: Trying
2016-08-02 17:43:06,135 DEBUG android: adb -s 192.168.1.2:5555 shell "getprop"
2016-08-02 17:43:06,215 DEBUG ResourceResolver: Resource <'s revent> found using :
2016-08-02 17:43:06,215 DEBUG ResourceResolver: /usr/local/lib/python2.7/dist-packages/wlauto/workloads/templerun/revent_files/juno.run.revent
2016-08-02 17:43:06,216 DEBUG android: adb -s 192.168.1.2:5555 shell "getprop"
2016-08-02 17:43:06,294 DEBUG ResourceResolver: Resolving <no-one's arm64 revent>
2016-08-02 17:43:06,295 DEBUG ResourceResolver: Trying
2016-08-02 17:43:06,295 DEBUG ResourceResolver: Trying
2016-08-02 17:43:06,296 DEBUG ResourceResolver: Trying
2016-08-02 17:43:06,296 DEBUG ResourceResolver: Resource <no-one's arm64 revent> found using :
2016-08-02 17:43:06,296 DEBUG ResourceResolver: /usr/local/lib/python2.7/dist-packages/wlauto/common/bin/arm64/revent
2016-08-02 17:43:06,296 DEBUG android: adb -s 192.168.1.2:5555 shell "mount"
2016-08-02 17:43:06,369 DEBUG android: adb -s 192.168.1.2:5555 push '/usr/local/lib/python2.7/dist-packages/wlauto/common/bin/arm64/revent' '/sdcard/wa-working/revent'
2016-08-02 17:43:07,555 DEBUG android: adb -s 192.168.1.2:5555 shell "echo 'cp /sdcard/wa-working/revent /data/local/tmp/wa-bin/revent' | su"
2016-08-02 17:43:07,705 DEBUG android: adb -s 192.168.1.2:5555 shell "echo 'chmod 0777 /data/local/tmp/wa-bin/revent' | su"
2016-08-02 17:43:07,800 DEBUG android: adb -s 192.168.1.2:5555 push '/usr/local/lib/python2.7/dist-packages/wlauto/workloads/templerun/revent_files/juno.run.revent' '/sdcard/wa-working/juno.run.revent'
2016-08-02 17:43:07,996 DEBUG android: adb -s 192.168.1.2:5555 push '/usr/local/lib/python2.7/dist-packages/wlauto/workloads/templerun/revent_files/juno.setup.revent' '/sdcard/wa-working/juno.setup.revent'
2016-08-02 17:43:08,177 DEBUG android: adb -s 192.168.1.2:5555 shell "getprop"
2016-08-02 17:43:08,256 DEBUG ResourceResolver: Resolving <'s arm64 APK>
2016-08-02 17:43:08,256 DEBUG ResourceResolver: Trying
2016-08-02 17:43:08,257 DEBUG RemoteFilerGetter: resource=<'s arm64 APK>, version=None, remote_path=templerun, local_path=/root/.workload_automation/dependencies/templerun
2016-08-02 17:43:08,257 DEBUG ResourceResolver: Trying
2016-08-02 17:43:08,257 DEBUG ResourceResolver: Trying
2016-08-02 17:43:08,257 DEBUG ResourceResolver: Trying
2016-08-02 17:43:08,257 DEBUG ResourceResolver: Resource <'s arm64 APK> found using :
2016-08-02 17:43:08,257 DEBUG ResourceResolver: /usr/local/lib/python2.7/dist-packages/wlauto/workloads/templerun/Temple_Run_2-Hack.apk
2016-08-02 17:43:08,257 DEBUG android: adb -s 192.168.1.2:5555 shell "if [ -f '/etc/arch-release' ]; then echo 1; else echo 0; fi"
2016-08-02 17:43:08,315 DEBUG android: adb -s 192.168.1.2:5555 shell "if [ -d '/etc/arch-release' ]; then echo 1; else echo 0; fi"
2016-08-02 17:43:08,382 DEBUG android: adb -s 192.168.1.2:5555 shell "if [ -f '/etc/debian_version' ]; then echo 1; else echo 0; fi"
2016-08-02 17:43:08,450 DEBUG android: adb -s 192.168.1.2:5555 shell "if [ -d '/etc/debian_version' ]; then echo 1; else echo 0; fi"
2016-08-02 17:43:08,518 DEBUG android: adb -s 192.168.1.2:5555 shell "if [ -f '/etc/lsb-release' ]; then echo 1; else echo 0; fi"
2016-08-02 17:43:08,588 DEBUG android: adb -s 192.168.1.2:5555 shell "if [ -d '/etc/lsb-release' ]; then echo 1; else echo 0; fi"
2016-08-02 17:43:08,654 DEBUG android: adb -s 192.168.1.2:5555 shell "if [ -f '/proc/config.gz' ]; then echo 1; else echo 0; fi"
2016-08-02 17:43:08,718 DEBUG android: adb -s 192.168.1.2:5555 shell "/data/local/tmp/wa-bin/busybox zcat /proc/config.gz"
2016-08-02 17:43:08,832 DEBUG android: adb -s 192.168.1.2:5555 shell "if [ -f '/proc/cmdline' ]; then echo 1; else echo 0; fi"
2016-08-02 17:43:08,894 DEBUG android: adb -s 192.168.1.2:5555 shell "cat /proc/cmdline"
2016-08-02 17:43:08,955 DEBUG android: adb -s 192.168.1.2:5555 shell "if [ -f '/proc/cpuinfo' ]; then echo 1; else echo 0; fi"
2016-08-02 17:43:09,023 DEBUG android: adb -s 192.168.1.2:5555 shell "cat /proc/cpuinfo"
2016-08-02 17:43:09,091 DEBUG android: adb -s 192.168.1.2:5555 shell "if [ -f '/proc/version' ]; then echo 1; else echo 0; fi"
2016-08-02 17:43:09,159 DEBUG android: adb -s 192.168.1.2:5555 shell "cat /proc/version"
2016-08-02 17:43:09,227 DEBUG android: adb -s 192.168.1.2:5555 shell "if [ -f '/proc/zconfig' ]; then echo 1; else echo 0; fi"
2016-08-02 17:43:09,295 DEBUG android: adb -s 192.168.1.2:5555 shell "if [ -d '/proc/zconfig' ]; then echo 1; else echo 0; fi"
2016-08-02 17:43:09,363 DEBUG android: adb -s 192.168.1.2:5555 shell "if [ -f '/sys/kernel/debug/sched_features' ]; then echo 1; else echo 0; fi"
2016-08-02 17:43:09,431 DEBUG android: adb -s 192.168.1.2:5555 shell "cat /sys/kernel/debug/sched_features"
2016-08-02 17:43:09,499 DEBUG android: adb -s 192.168.1.2:5555 shell "if [ -f '/sys/kernel/hmp' ]; then echo 1; else echo 0; fi"
2016-08-02 17:43:09,568 DEBUG android: adb -s 192.168.1.2:5555 shell "if [ -d '/sys/kernel/hmp' ]; then echo 1; else echo 0; fi"
2016-08-02 17:43:09,635 DEBUG android: adb -s 192.168.1.2:5555 pull '/sys/kernel/hmp' 'wa_output/__meta/sys.kernel.hmp'
2016-08-02 17:43:10,208 DEBUG android: adb -s 192.168.1.2:5555 shell "getprop"
2016-08-02 17:43:10,293 DEBUG android: adb -s 192.168.1.2:5555 shell "dumpsys window > /sdcard/wa-working/window.dumpsys"
2016-08-02 17:43:12,240 DEBUG android: adb -s 192.168.1.2:5555 pull '/sdcard/wa-working/window.dumpsys' 'wa_output/__meta/window.dumpsys'
2016-08-02 17:43:12,412 DEBUG android: adb -s 192.168.1.2:5555 shell "ls /sys/devices/system/cpu/cpu0/cpuidle"
2016-08-02 17:43:12,495 DEBUG android: adb -s 192.168.1.2:5555 shell "echo 'cat '\''/sys/devices/system/cpu/cpu0/cpuidle/state0/desc'\''' | su"
2016-08-02 17:43:12,576 DEBUG android: adb -s 192.168.1.2:5555 shell "echo 'cat '\''/sys/devices/system/cpu/cpu0/cpuidle/state0/name'\''' | su"
2016-08-02 17:43:12,657 DEBUG android: adb -s 192.168.1.2:5555 shell "echo 'cat '\''/sys/devices/system/cpu/cpu0/cpuidle/state0/latency'\''' | su"
2016-08-02 17:43:12,750 DEBUG android: adb -s 192.168.1.2:5555 shell "echo 'cat '\''/sys/devices/system/cpu/cpu0/cpuidle/state0/power'\''' | su"
2016-08-02 17:43:12,841 DEBUG android: adb -s 192.168.1.2:5555 shell "echo 'cat '\''/sys/devices/system/cpu/cpu0/cpuidle/state1/desc'\''' | su"
2016-08-02 17:43:12,932 DEBUG android: adb -s 192.168.1.2:5555 shell "echo 'cat '\''/sys/devices/system/cpu/cpu0/cpuidle/state1/name'\''' | su"
2016-08-02 17:43:13,025 DEBUG android: adb -s 192.168.1.2:5555 shell "echo 'cat '\''/sys/devices/system/cpu/cpu0/cpuidle/state1/latency'\''' | su"
2016-08-02 17:43:13,117 DEBUG android: adb -s 192.168.1.2:5555 shell "echo 'cat '\''/sys/devices/system/cpu/cpu0/cpuidle/state1/power'\''' | su"
2016-08-02 17:43:13,208 DEBUG android: adb -s 192.168.1.2:5555 shell "echo 'cat '\''/sys/devices/system/cpu/cpu0/cpuidle/state2/desc'\''' | su"
2016-08-02 17:43:13,288 DEBUG android: adb -s 192.168.1.2:5555 shell "echo 'cat '\''/sys/devices/system/cpu/cpu0/cpuidle/state2/name'\''' | su"
2016-08-02 17:43:13,362 DEBUG android: adb -s 192.168.1.2:5555 shell "echo 'cat '\''/sys/devices/system/cpu/cpu0/cpuidle/state2/latency'\''' | su"
2016-08-02 17:43:13,452 DEBUG android: adb -s 192.168.1.2:5555 shell "echo 'cat '\''/sys/devices/system/cpu/cpu0/cpuidle/state2/power'\''' | su"
2016-08-02 17:43:13,545 DEBUG android: adb -s 192.168.1.2:5555 shell "if [ -e '/sys/devices/system/cpu/cpuidle' ]; then echo 1; else echo 0; fi"
2016-08-02 17:43:13,613 DEBUG ResourceResolver: Resolving <juno_energy's file readenergy>
2016-08-02 17:43:13,613 DEBUG ResourceResolver: Trying
2016-08-02 17:43:13,635 DEBUG ResourceResolver: Trying
2016-08-02 17:43:13,635 DEBUG RemoteFilerGetter: resource=<juno_energy's file readenergy>, version=None, remote_path=juno_energy, local_path=/root/.workload_automation/dependencies/juno_energy
2016-08-02 17:43:13,636 DEBUG ResourceResolver: Trying
2016-08-02 17:43:13,636 DEBUG ResourceResolver: Trying
2016-08-02 17:43:13,636 DEBUG ResourceResolver: Trying
2016-08-02 17:43:13,636 DEBUG ResourceResolver: Trying
2016-08-02 17:43:13,637 DEBUG ResourceResolver: Trying
2016-08-02 17:43:13,637 DEBUG ResourceResolver: Resource <juno_energy's file readenergy> found using :
2016-08-02 17:43:13,637 DEBUG ResourceResolver: /usr/local/lib/python2.7/dist-packages/wlauto/instrumentation/juno_energy/readenergy
2016-08-02 17:43:13,637 DEBUG android: adb -s 192.168.1.2:5555 shell "ps | /data/local/tmp/wa-bin/busybox grep readenergy"
2016-08-02 17:43:13,734 DEBUG android: adb -s 192.168.1.2:5555 shell "mount"
2016-08-02 17:43:13,800 DEBUG android: adb -s 192.168.1.2:5555 push '/usr/local/lib/python2.7/dist-packages/wlauto/instrumentation/juno_energy/readenergy' '/sdcard/wa-working/readenergy'
2016-08-02 17:43:14,131 DEBUG android: adb -s 192.168.1.2:5555 shell "echo 'cp /sdcard/wa-working/readenergy /data/local/tmp/wa-bin/readenergy' | su"
2016-08-02 17:43:14,247 DEBUG android: adb -s 192.168.1.2:5555 shell "echo 'chmod 0777 /data/local/tmp/wa-bin/readenergy' | su"
2016-08-02 17:43:14,342 DEBUG android: adb -s 192.168.1.2:5555 shell "getprop"
2016-08-02 17:43:14,422 DEBUG ResourceResolver: Resolving <trace-cmd's arm64 trace-cmd>
2016-08-02 17:43:14,422 DEBUG ResourceResolver: Trying
2016-08-02 17:43:14,423 DEBUG ResourceResolver: Trying
2016-08-02 17:43:14,423 DEBUG ResourceResolver: Trying
2016-08-02 17:43:14,438 DEBUG ResourceResolver: Resource <trace-cmd's arm64 trace-cmd> found using :
2016-08-02 17:43:14,438 DEBUG ResourceResolver: /usr/local/lib/python2.7/dist-packages/wlauto/instrumentation/trace_cmd/bin/arm64/trace-cmd
2016-08-02 17:43:14,439 DEBUG android: adb -s 192.168.1.2:5555 shell "mount"
2016-08-02 17:43:14,524 DEBUG android: adb -s 192.168.1.2:5555 push '/usr/local/lib/python2.7/dist-packages/wlauto/instrumentation/trace_cmd/bin/arm64/trace-cmd' '/sdcard/wa-working/trace-cmd'
2016-08-02 17:43:15,116 DEBUG android: adb -s 192.168.1.2:5555 shell "echo 'cp /sdcard/wa-working/trace-cmd /data/local/tmp/wa-bin/trace-cmd' | su"
2016-08-02 17:43:15,232 DEBUG android: adb -s 192.168.1.2:5555 shell "echo 'chmod 0777 /data/local/tmp/wa-bin/trace-cmd' | su"
2016-08-02 17:43:15,328 DEBUG android: adb -s 192.168.1.2:5555 shell "if [ -e '/sdcard/wa-working/temp-fs-cpufreq' ]; then echo 1; else echo 0; fi"
2016-08-02 17:43:15,393 DEBUG android: adb -s 192.168.1.2:5555 shell "echo 'mkdir -p /sdcard/wa-working/temp-fs-cpufreq' | su"
2016-08-02 17:43:15,503 DEBUG android: adb -s 192.168.1.2:5555 shell "echo 'mount -t tmpfs -o size=32m tmpfs /sdcard/wa-working/temp-fs-cpufreq' | su"
2016-08-02 17:43:15,617 INFO Runner: Running workload 1 templerun (iteration 1)
2016-08-02 17:43:15,617 DEBUG instrumentation: Disabling instrument juno_energy
2016-08-02 17:43:15,618 DEBUG instrumentation: Disabling instrument execution_time
2016-08-02 17:43:15,618 DEBUG instrumentation: Disabling instrument interrupts
2016-08-02 17:43:15,618 DEBUG instrumentation: Disabling instrument trace-cmd
2016-08-02 17:43:15,618 DEBUG instrumentation: Disabling instrument cpufreq
2016-08-02 17:43:15,618 DEBUG instrumentation: Disabling instrument fps
2016-08-02 17:43:15,619 DEBUG instrumentation: Enabling instrument execution_time
2016-08-02 17:43:15,619 DEBUG instrumentation: Enabling instrument interrupts
2016-08-02 17:43:15,619 DEBUG instrumentation: Enabling instrument cpufreq
2016-08-02 17:43:15,619 DEBUG instrumentation: Enabling instrument fps
2016-08-02 17:43:15,620 DEBUG instrumentation: Enabling instrument trace-cmd
2016-08-02 17:43:15,620 DEBUG instrumentation: Enabling instrument juno_energy
2016-08-02 17:43:15,620 DEBUG Runner: Setting up device parameters
2016-08-02 17:43:15,622 DEBUG Runner: running templerun
2016-08-02 17:43:15,622 INFO Runner: Setting up
2016-08-02 17:43:15,622 DEBUG android: adb -s 192.168.1.2:5555 shell "dumpsys package com.imangi.templerun"
2016-08-02 17:43:15,700 DEBUG android: /usr/build-tools/android-4.2.2/aapt dump badging /usr/local/lib/python2.7/dist-packages/wlauto/workloads/templerun/Temple_Run_2-Hack.apk
2016-08-02 17:43:15,723 INFO Runner: Skipping the rest of the iterations for this spec.
2016-08-02 17:43:15,723 ERROR Runner: Error while running templerun
2016-08-02 17:43:15,723 ERROR Runner: OSError("[Errno 13] Permission denied")
2016-08-02 17:43:15,737 ERROR Runner: File "/usr/local/lib/python2.7/dist-packages/wlauto/core/execution.py", line 739, in _handle_errors
2016-08-02 17:43:15,737 ERROR Runner: yield
2016-08-02 17:43:15,737 ERROR Runner: File "/usr/local/lib/python2.7/dist-packages/wlauto/core/execution.py", line 618, in _run_job
2016-08-02 17:43:15,737 ERROR Runner: self._run_workload_iteration(spec.workload)
2016-08-02 17:43:15,737 ERROR Runner: File "/usr/local/lib/python2.7/dist-packages/wlauto/core/execution.py", line 673, in _run_workload_iteration
2016-08-02 17:43:15,737 ERROR Runner: workload.setup(self.context)
2016-08-02 17:43:15,737 ERROR Runner: File "/usr/local/lib/python2.7/dist-packages/wlauto/common/android/workload.py", line 474, in setup
2016-08-02 17:43:15,737 ERROR Runner: ApkWorkload.setup(self, context)
2016-08-02 17:43:15,737 ERROR Runner: File "/usr/local/lib/python2.7/dist-packages/wlauto/common/android/workload.py", line 196, in setup
2016-08-02 17:43:15,737 ERROR Runner: self.initialize_package(context)
2016-08-02 17:43:15,737 ERROR Runner: File "/usr/local/lib/python2.7/dist-packages/wlauto/common/android/workload.py", line 204, in initialize_package
2016-08-02 17:43:15,737 ERROR Runner: self.initialize_with_host_apk(context, installed_version)
2016-08-02 17:43:15,737 ERROR Runner: File "/usr/local/lib/python2.7/dist-packages/wlauto/common/android/workload.py", line 217, in initialize_with_host_apk
2016-08-02 17:43:15,737 ERROR Runner: host_version = ApkInfo(self.apk_file).version_name
2016-08-02 17:43:15,737 ERROR Runner: File "/usr/local/lib/python2.7/dist-packages/wlauto/utils/android.py", line 157, in __init__
2016-08-02 17:43:15,737 ERROR Runner: self.parse(path)
2016-08-02 17:43:15,737 ERROR Runner: File "/usr/local/lib/python2.7/dist-packages/wlauto/utils/android.py", line 163, in parse
2016-08-02 17:43:15,737 ERROR Runner: output = subprocess.check_output(command)
2016-08-02 17:43:15,737 ERROR Runner: File "/usr/lib/python2.7/subprocess.py", line 537, in check_output
2016-08-02 17:43:15,737 ERROR Runner: process = Popen(stdout=PIPE, *popenargs, **kwargs)
2016-08-02 17:43:15,737 ERROR Runner: File "/usr/lib/python2.7/subprocess.py", line 679, in __init__
2016-08-02 17:43:15,737 ERROR Runner: errread, errwrite)
2016-08-02 17:43:15,737 ERROR Runner: File "/usr/lib/python2.7/subprocess.py", line 1249, in _execute_child
2016-08-02 17:43:15,737 ERROR Runner: raise child_exception
2016-08-02 17:43:15,737 ERROR Runner:
2016-08-02 17:43:15,737 INFO Runner: Job status was FAILED. Retrying...
2016-08-02 17:43:15,737 INFO Runner: Skipping workload 1 templerun (iteration 2)
2016-08-02 17:43:15,737 INFO Runner: Skipping workload 1 templerun (iteration 3)
2016-08-02 17:43:15,737 INFO Runner: Skipping workload 1 templerun (iteration 4)
2016-08-02 17:43:15,738 INFO Runner: Skipping workload 1 templerun (iteration 5)
2016-08-02 17:43:15,738 INFO Runner: Skipping workload 1 templerun (iteration 6)
2016-08-02 17:43:15,738 DEBUG instrumentation: Enabling instrument juno_energy
2016-08-02 17:43:15,738 DEBUG instrumentation: Enabling instrument execution_time
2016-08-02 17:43:15,738 DEBUG instrumentation: Enabling instrument interrupts
2016-08-02 17:43:15,738 DEBUG instrumentation: Enabling instrument trace-cmd
2016-08-02 17:43:15,738 DEBUG instrumentation: Enabling instrument cpufreq
2016-08-02 17:43:15,738 DEBUG instrumentation: Enabling instrument fps
2016-08-02 17:43:15,738 INFO Runner: Finalizing workloads
2016-08-02 17:43:15,738 INFO Runner: Finalizing.
Is the permission denied a Juno problem or a WA problem?
Instead of using /usr/local/lib/python2.7/dist-packages/wlauto/workloads/templerun/ try using ~/.workload_automation/dependencies/templerun/
Assumed resolved
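For context on the failure above: the `OSError` is raised on the host, not the device. `Popen` fails with errno 13 (`EACCES`) while launching the host-side `aapt` binary logged a few lines earlier, which typically means the file is missing its execute bit. A minimal C demonstration of that errno semantics (the `/tmp/perm_demo.sh` path is an arbitrary scratch path for the demo, not part of WA):

```c
#include <errno.h>
#include <fcntl.h>
#include <sys/stat.h>
#include <sys/types.h>
#include <sys/wait.h>
#include <unistd.h>

/* Exec `path` in a child process; return the exec errno (0 if it ran). */
static int exec_errno(const char *path)
{
    pid_t pid = fork();
    if (pid == 0) {
        execl(path, path, (char *)NULL);
        _exit(errno & 0xff);     /* exec failed: pass errno out via status */
    }
    int status = 0;
    waitpid(pid, &status, 0);
    return WIFEXITED(status) ? WEXITSTATUS(status) : -1;
}

/* Write a tiny shell script with the given mode, then try to exec it. */
static int exec_errno_with_mode(mode_t mode)
{
    const char *path = "/tmp/perm_demo.sh";   /* scratch path for the demo */
    int fd = open(path, O_CREAT | O_WRONLY | O_TRUNC, mode);
    if (fd < 0)
        return -1;
    (void)write(fd, "#!/bin/sh\nexit 0\n", 17);
    fchmod(fd, mode);            /* force the mode even if the file existed */
    close(fd);
    return exec_errno(path);
}
```

With mode 0644 the exec fails with `EACCES` even for root, since the file has no execute bit at all; with 0755 it runs. That points at checking the permissions of the host `aapt` under `/usr/build-tools/`.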
| gharchive/issue | 2016-08-03T01:30:04 | 2025-04-01T06:36:40.064085 | {
"authors": [
"ep1cman",
"luodaidong"
],
"repo": "ARM-software/workload-automation",
"url": "https://github.com/ARM-software/workload-automation/issues/218",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
} |
159450812 | Fix #34: Do not try to use a va_list twice
If the prefix function was set, mbed_vtracef would pass ap to vsnprintf to
determine the output length, then use ap again for the actual output,
running off the real arguments.
Create a copy of ap for the initial scan. (Thank you to C99, who added
va_copy. Would have been stuck without it.)
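The pattern the fix describes is standard C99: clone the `va_list` with `va_copy` before the length-measuring pass, and keep the original untouched for the output pass. A minimal standalone sketch of the pattern (not the mbed-trace code itself):

```c
#include <stdarg.h>
#include <stdio.h>
#include <stdlib.h>

/* Format into a freshly allocated buffer.  vsnprintf consumes the
 * va_list it is given, so the length-measuring pass must run on a
 * copy; reusing `ap` for the second pass would read past the real
 * arguments (the bug described in this PR). */
char *vformat(const char *fmt, va_list ap)
{
    va_list scan;
    va_copy(scan, ap);                        /* C99: clone before scanning */
    int len = vsnprintf(NULL, 0, fmt, scan);  /* pass 1: length only */
    va_end(scan);
    if (len < 0)
        return NULL;

    char *buf = malloc((size_t)len + 1);
    if (buf != NULL)
        vsnprintf(buf, (size_t)len + 1, fmt, ap);  /* pass 2: untouched ap */
    return buf;
}

char *format(const char *fmt, ...)
{
    va_list ap;
    va_start(ap, fmt);
    char *s = vformat(fmt, ap);
    va_end(ap);
    return s;
}
```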
missing test :)
Verified with mbed-client-testapp on Linux, works fine
+1
thanks :)
Expanded the existing prefix function test to take 2 arguments.
| gharchive/pull-request | 2016-06-09T16:23:36 | 2025-04-01T06:36:40.138536 | {
"authors": [
"jupe",
"kjbracey-arm",
"yogpan01"
],
"repo": "ARMmbed/mbed-trace",
"url": "https://github.com/ARMmbed/mbed-trace/pull/35",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
604726170 | Migrate release announcements
Background
Release announcements are currently sent to a list of emails originating from the website, but which faces several issues (we don't have a mechanism to extend it anymore and sending out the announcements has an unacceptable delay).
Task Breakdown
[x] create a child page for the release process in confluence for announcement emails and document the remaining subtasks there
[x] ask @danh-arm about the process and create a new release mailing list (eg [email protected])
[x] create an official email address for the purpose of sending the announcements from that (eg [email protected])
[x] compose a last message to the old list with an announcement about the move, point them to the new announcement list, and invite them to the developer lists as well. Have the text reviewed by @danh-arm or @yanesca
[x] send out the last message (on how to do that, see "release announcement" on the Release Process confluence page)
[ ] draft an email template for a release email that is sent out 1 or 2 weeks prior to the actual release
[x] revise and improve the template for the second release email. (for a sample release email see "release announcement" on the Release Process confluence page)
[x] have both of the templates reviewed by @danh-arm and @yanesca
This is linked to #3231 in that we want to announce that change at the same time.
The release announcements have been going out on the new announcement list for more than 6 months now. Sending an advance announcement wasn't part of the process and is not necessary for the new release process.
Closing this issue as completed.
| gharchive/issue | 2020-04-22T12:41:28 | 2025-04-01T06:36:40.144030 | {
"authors": [
"danh-arm",
"yanesca"
],
"repo": "ARMmbed/mbedtls",
"url": "https://github.com/ARMmbed/mbedtls/issues/3226",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
} |
1620568476 | Cannot build config tsc debug: MLIR not found
I'm currently trying to build the project to work on my side, but I can't get past config_tsc_debug.bat
I'm building using Visual Studio 16 2019
C:\Users\Administrator\Desktop\BetterElytraBot\TypeScriptCompiler\__build\tsc>cmake ../../tsc -G "Visual Studio 16 2019" -A x64 -DCMAKE_BUILD_TYPE=Debug -Wno-dev
-- Selecting Windows SDK version 10.0.19041.0 to target Windows 10.0.19045.
-- MLIR_DIR is C:/Users/Administrator/Desktop/BetterElytraBot/TypeScriptCompiler/3rdParty/llvm/debug/lib/cmake/mlir
-- CMAKE_VS_PLATFORM_TOOLSET_HOST_ARCHITECTURE was x64 and set to x64
CMake Error at CMakeLists.txt:62 (find_package):
Could not find a package configuration file provided by "MLIR" with any of
the following names:
MLIRConfig.cmake
mlir-config.cmake
Add the installation prefix of "MLIR" to CMAKE_PREFIX_PATH or set
"MLIR_DIR" to a directory containing one of the above files. If "MLIR"
provides a separate development package or SDK, be sure it has been
installed.
-- Configuring incomplete, errors occurred!
See also "C:/Users/Administrator/Desktop/BetterElytraBot/TypeScriptCompiler/__build/tsc/CMakeFiles/CMakeOutput.log".
See also "C:/Users/Administrator/Desktop/BetterElytraBot/TypeScriptCompiler/__build/tsc/CMakeFiles/CMakeError.log".
The folder /llvm/debug/lib/mlir does not exist
you need to run prepare_3rdParty.bat and ensure that it finished
then u need to open "tsc" folder and run 2 bats: config_tsc_debug.bat and then build_tsc_debug.bat
then u need to open "tsc" folder and run 2 bats: config_tsc_debug.bat and then build_tsc_debug.bat
I found out that I needed more than 120 GB of storage to build it so I made some space and restarted
This is what it looks like now
But it looks like there's still some missing folders
I did find an error and a warning from prepare_3rdParty.bat
CMake Warning (dev) at C:/Program Files/CMake/share/cmake-3.24/Modules/GNUInstallDirs.cmake:243 (message):
Unable to determine default CMAKE_INSTALL_LIBDIR directory because no
target architecture is known. Please enable at least one language before
including GNUInstallDirs.
Call Stack (most recent call first):
C:/Users/Administrator/Desktop/BetterElytraBot/TypeScriptCompiler/3rdParty/llvm-project/llvm/cmake/modules/LLVMInstallSymlink.cmake:5 (include)
tools/llvm-ar/cmake_install.cmake:48 (include)
tools/cmake_install.cmake:39 (include)
cmake_install.cmake:69 (include)
This warning is for project developers. Use -Wno-dev to suppress it.
CMake Error at tools/mlir/lib/Dialect/MemRef/Transforms/cmake_install.cmake:37 (file):
file INSTALL cannot find
"C:/Users/Administrator/Desktop/BetterElytraBot/TypeScriptCompiler/__build/llvm/debug/lib/MLIRMemRefTransforms.lib":
No such file or directory.
Call Stack (most recent call first):
tools/mlir/lib/Dialect/MemRef/cmake_install.cmake:38 (include)
tools/mlir/lib/Dialect/cmake_install.cmake:66 (include)
tools/mlir/lib/cmake_install.cmake:40 (include)
tools/mlir/cmake_install.cmake:55 (include)
tools/cmake_install.cmake:45 (include)
cmake_install.cmake:69 (include)
I built the project again
Here are the errors
CMake Warning (dev) at C:/Program Files/CMake/share/cmake-3.24/Modules/GNUInstallDirs.cmake:243 (message):
Unable to determine default CMAKE_INSTALL_LIBDIR directory because no
target architecture is known. Please enable at least one language before
including GNUInstallDirs.
Call Stack (most recent call first):
C:/Users/Administrator/Desktop/BetterElytraBot/TypeScriptCompiler/3rdParty/llvm-project/llvm/cmake/modules/LLVMInstallSymlink.cmake:5 (include)
tools/llvm-ar/cmake_install.cmake:48 (include)
tools/cmake_install.cmake:39 (include)
cmake_install.cmake:69 (include)
This warning is for project developers. Use -Wno-dev to suppress it.
CMake Error at tools/mlir/lib/Dialect/MemRef/Transforms/cmake_install.cmake:37 (file):
file INSTALL cannot find
"C:/Users/Administrator/Desktop/BetterElytraBot/TypeScriptCompiler/__build/llvm/debug/lib/MLIRMemRefTransforms.lib":
No such file or directory.
Call Stack (most recent call first):
tools/mlir/lib/Dialect/MemRef/cmake_install.cmake:38 (include)
tools/mlir/lib/Dialect/cmake_install.cmake:66 (include)
tools/mlir/lib/cmake_install.cmake:40 (include)
tools/mlir/cmake_install.cmake:55 (include)
tools/cmake_install.cmake:45 (include)
cmake_install.cmake:69 (include)
try to delete the __build folder and try again
try to delete the __build folder and try again
I tried that multiple times
The 4 first builds were all different
And the 2 latests builds ended up with the same content
I'm not exactly sure what could cause this; maybe I have to use Visual Studio 17 2022 instead of 16 2019, but I can't get the compiler to be recognized with that version
try to use "ninja" build tools
Are you building on Linux?
I will try to build the project in linux to see if it works with ninja
| gharchive/issue | 2023-03-13T00:20:51 | 2025-04-01T06:36:40.167298 | {
"authors": [
"ASDAlexander77",
"Edouard127"
],
"repo": "ASDAlexander77/TypeScriptCompiler",
"url": "https://github.com/ASDAlexander77/TypeScriptCompiler/issues/22",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1365287730 | Add autoRIFT golden test pairs for L4,5,7,9
The landsat-4 pair here failed with an OutOfMemoryError: Container killed due to memory usage for all 3 attempts (the log just abruptly ends which is indicative of a memory failure; confirmed by looking at the batch job)
https://hyp3-test-api.asf.alaska.edu/jobs/5efd7f58-0cdf-4663-bed7-68d7b00b16ee
I'll dig up an L4 pair that we know runs.
@forrestfwilliams any thoughts as to why the L4 pair failed? Looks to be during the FFT step.
Submitted all the proposed pairs here to production:
https://hyp3-api.asf.alaska.edu/jobs?name=Golden test update for l457
{
"jobs": [
{
"execution_started": true,
"job_id": "75e52782-505b-4127-9107-66218c6691af",
"job_parameters": {
"granules": [
"LC09_L1GT_215109_20220125_20220125_02_T2",
"LC09_L1GT_215109_20220210_20220210_02_T2"
]
},
"job_type": "AUTORIFT",
"name": "Golden test update for l457",
"priority": 9996,
"request_time": "2022-11-03T00:36:26+00:00",
"status_code": "PENDING",
"user_id": "jhkennedy"
},
{
"execution_started": true,
"job_id": "87278854-a560-4bab-8004-4f79c1f5e73f",
"job_parameters": {
"granules": [
"LT05_L1GS_001013_19920425_20200915_02_T2",
"LT05_L1GS_001013_19920628_20200914_02_T2"
]
},
"job_type": "AUTORIFT",
"name": "Golden test update for l457",
"priority": 9998,
"request_time": "2022-11-03T00:36:26+00:00",
"status_code": "PENDING",
"user_id": "jhkennedy"
},
{
"execution_started": true,
"job_id": "d0e55049-c7cd-444b-87b9-6fef0d31b276",
"job_parameters": {
"granules": [
"LE07_L1TP_063018_20040911_20200915_02_T1",
"LE07_L1TP_063018_20040810_20200915_02_T1"
]
},
"job_type": "AUTORIFT",
"name": "Golden test update for l457",
"priority": 9997,
"request_time": "2022-11-03T00:36:26+00:00",
"status_code": "PENDING",
"user_id": "jhkennedy"
},
{
"execution_started": true,
"job_id": "879efe81-adf2-48cb-89db-f4bd88c577e4",
"job_parameters": {
"granules": [
"LT04_L1TP_063018_19880627_20200917_02_T1",
"LT04_L1TP_063018_19880627_20200917_02_T1"
]
},
"job_type": "AUTORIFT",
"name": "Golden test update for l457",
"priority": 9999,
"request_time": "2022-11-03T00:36:26+00:00",
"status_code": "PENDING",
"user_id": "jhkennedy"
}
],
"next": "https://hyp3-api.asf.alaska.edu/jobs?name=Golden+test+update+for+l457&start_token=eyJqb2JfaWQiOiAiZWEyNzkyMzgtM2M0Yi00YWI0LThkOWUtZWQ0OWZhZGZiYzkzIiwgInVzZXJfaWQiOiAiamhrZW5uZWR5IiwgInJlcXVlc3RfdGltZSI6ICIyMDIyLTAyLTE4VDIyOjI5OjQyKzAwOjAwIn0%3D"
}
Whoops, accidentally submitted the L4 pair with the same reference and secondary scenes. Here's the correct pair, as in the suggestion:
https://hyp3-api.asf.alaska.edu/jobs/4a74c527-f8d4-4ddf-b2ad-a3385f814153
| gharchive/pull-request | 2022-09-07T22:59:04 | 2025-04-01T06:36:40.173901 | {
"authors": [
"jhkennedy"
],
"repo": "ASFHyP3/hyp3-testing",
"url": "https://github.com/ASFHyP3/hyp3-testing/pull/53",
"license": "BSD-3-Clause",
"license_type": "permissive",
"license_source": "github-api"
} |
1650942690 | 🛑 eatery.ch is down
In 7a36efe, eatery.ch (https://www.eatery.ch/) was down:
HTTP code: 502
Response time: 953 ms
Resolved: eatery.ch is back up in d942ae4.
| gharchive/issue | 2023-04-02T13:12:25 | 2025-04-01T06:36:40.179001 | {
"authors": [
"ASchmidt1024"
],
"repo": "ASchmidt1024/uptime",
"url": "https://github.com/ASchmidt1024/uptime/issues/21",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
I installed it before, then uninstalled it because the image could not go fullscreen, but the image would not go away.
Now that I see fullscreen works, I installed it again. The new image is indeed fullscreen, but it overlaps with the old image.
Your previous image was most likely added with that other extension: background.
Solution: try installing that previous extension again, disable/turn off the image there, and then uninstall it.
| gharchive/issue | 2022-01-23T09:28:40 | 2025-04-01T06:36:40.180220 | {
"authors": [
"AShujiao",
"tomuGo"
],
"repo": "AShujiao/vscode-background-cover",
"url": "https://github.com/AShujiao/vscode-background-cover/issues/75",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
510429747 | Highly active tP noted -- Kudos!
Guys, it looks like your tP is maintaining a high level of coding activity so far and all members are actively contributing (as per the tP Dashboard) :+1:
Note: hope you can work together to help less-active team members, if they need help.
Keep up the good work!
noted 👍
| gharchive/issue | 2019-10-22T04:58:18 | 2025-04-01T06:36:40.197695 | {
"authors": [
"ang-zeyu",
"damithc"
],
"repo": "AY1920S1-CS2103T-T11-4/main",
"url": "https://github.com/AY1920S1-CS2103T-T11-4/main/issues/71",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1046728764 | Update documentation
Changes include:
Fix issues in user guide
Update user stories in developer guide
Add additional NFRs in developer guide
Add Effort appendix in developer guide
Enlarge the storage class diagram
Readjust the model component in the UI class diagram
Add PPP
Update Documentation.md to remove traces of AB-3
Update index.md to remove traces of address book
Codecov Report
Merging #286 (ca45524) into master (abc7c90) will not change coverage.
The diff coverage is n/a.
@@ Coverage Diff @@
## master #286 +/- ##
=========================================
Coverage 76.94% 76.94%
Complexity 998 998
=========================================
Files 140 140
Lines 2646 2646
Branches 356 356
=========================================
Hits 2036 2036
Misses 530 530
Partials 80 80
Continue to review full report at Codecov.
Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update abc7c90...ca45524. Read the comment docs.
LGTM
| gharchive/pull-request | 2021-11-07T12:01:41 | 2025-04-01T06:36:40.213491 | {
"authors": [
"codecov-commenter",
"g4ryy",
"nniiggeell"
],
"repo": "AY2122S1-CS2103-T16-3/tp",
"url": "https://github.com/AY2122S1-CS2103-T16-3/tp/pull/286",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1046679827 | Fix bold
Fixes #312
Codecov Report
Merging #314 (3d7a044) into master (122cdc2) will not change coverage.
The diff coverage is n/a.
@@ Coverage Diff @@
## master #314 +/- ##
=========================================
Coverage 75.97% 75.97%
Complexity 683 683
=========================================
Files 94 94
Lines 1998 1998
Branches 223 223
=========================================
Hits 1518 1518
Misses 414 414
Partials 66 66
Continue to review full report at Codecov.
Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 122cdc2...3d7a044. Read the comment docs.
| gharchive/pull-request | 2021-11-07T07:24:13 | 2025-04-01T06:36:40.220176 | {
"authors": [
"codecov-commenter",
"yeo-yiheng"
],
"repo": "AY2122S1-CS2103T-T13-2/tp",
"url": "https://github.com/AY2122S1-CS2103T-T13-2/tp/pull/314",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1031442293 | Update functionality of finding people
IMPORTANT (Can skip reviewing files changed first)
The functionality of find has been drastically changed to include flags for finding based on any detail (e.g. find -n for names). It will not work without flags but I am open to changing back to the original way of finding by name without a flag. Also, the keywords for find must be spaced apart, otherwise, it is taken as one keyword
It will no longer be possible to find multiple people at once just by matching any of the keywords for this implementation so "find -n Alex Bernice" will not give any results for the basic list because "Alex Bernice" does not match any name as an abbreviation. However, if the keywords match the names of multiple people, it would still return more than one person. There would probably need to be a certain character or flag to specify finding people with other abbreviations (which I think is unnecessary since there are tags and tasks for grouping but I would be open to making this option possible).
Also, the "subsequence" is relative to each part of a person's detail (e.g. name). So, an example would be "Harry James Potter" and "Harry J Potter" or "h P" would work because there is some part of the name that starts with one of the keywords and the order of the keywords relative to the name is the same, regardless of case. For this reasons, "a J P" would not be valid. It also preserves the functionality of finding by full name.
Let me know if the sequence is supposed to be one keyword and just match the names by each character in the subsequence and have the same order as the different parts of the name (e.g. ay -> alex yeoh) because I think a pure subsequence to match any part of a person's name could end up in matching many people's names that should not have matched unless it is case-sensitive (e.g. ex -> alex yeoh). Its also consistent with the way I implemented the find function for other details like tags and tasks because I think one word as a subsequence for many tags or tasks is not ideal. So, this is why I implemented finding by names this way but I could see a special case for names
Also, I'm not sure if everything needs to be abbreviated, like phone number, so do let me know as well.
Less important
However, this order that I've been using does not matter for tags or tasks since it is not as important as for the other details, though the keywords will still have to all match someone's tags or tasks, following the idea for finding by name (or its abbreviation).
LGTM. Might want to consider adding for description too.
Some descriptions might be long and random so I think it might be doable but I I think I would have to go back to full word matches to reduce the number of matches. If everyone thinks its ok, then I'll implement it.
LGTM. I think how the keywords are matched are similar to IntelliJ's search which I think is fine.
| gharchive/pull-request | 2021-10-20T13:54:22 | 2025-04-01T06:36:40.225308 | {
"authors": [
"kflim",
"limzk126"
],
"repo": "AY2122S1-CS2103T-W10-1/tp",
"url": "https://github.com/AY2122S1-CS2103T-W10-1/tp/pull/72",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1192751218 | Update PPP - Hao Wei
Changes
This PR amends the PPP by adding recent contributions to the program as well as standardizing the program description.
Related Issues
None.
Codecov Report
Merging #322 (f74f3c0) into master (ed8ca54) will not change coverage.
The diff coverage is n/a.
@@ Coverage Diff @@
## master #322 +/- ##
=========================================
Coverage 72.28% 72.28%
Complexity 1247 1247
=========================================
Files 160 160
Lines 3842 3842
Branches 452 452
=========================================
Hits 2777 2777
Misses 985 985
Partials 80 80
Continue to review full report at Codecov.
Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update ed8ca54...f74f3c0. Read the comment docs.
| gharchive/pull-request | 2022-04-05T07:38:49 | 2025-04-01T06:36:40.269556 | {
"authors": [
"KwanHW",
"codecov-commenter"
],
"repo": "AY2122S2-CS2103-W17-1/tp",
"url": "https://github.com/AY2122S2-CS2103-W17-1/tp/pull/322",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1423729718 | Enhance new meeting parser
NOTE:
This is a major change that affects how CreateMeetingCommand and CreateMeetingCommandParser function, although the expected outcomes are the same. Need to re-write the corresponding tests.
What?
Modified CreateMeetingCommandParser such that it handles more of the validation of the user input instead of letting CreateMeetingCommand and Meeting handle the validation.
Why?
Previously most of the validation takes place in the CreateMeetingCommand and Meeting classes -- in CreateMeetingCommandParser, the parse(arguments) method only checks whether the arguments are empty, while passing the trimmed input arguments wholesale to the CreateMeetingCommand class for processing, which defeats the purpose of a parser. Thus it is preferable to restructure the code such that CreateMeetingCommandParser adheres to the necessary functionalities of a parser. Refer to #68
How?
I have moved the splitting of the trimmed user input (names of Person(s) to meet, meeting description, meeting date and time, meeting location) and the validation of date and time [essentially processes that do not require the model] forward to CreateMeetingCommandParser, leaving CreateMeetingCommand to use the model to handle the validation of the person/ contact, as well as the creation and validation of the new Meeting. Meeting will take in the information of the new meeting passed in as the arguments of its constructor without further validation.
Codecov Report
Base: 67.82% // Head: 62.18% // Decreases project coverage by -5.64% :warning:
Coverage data is based on head (ba0370d) compared to base (057cbb3).
Patch coverage: 38.98% of modified lines in pull request are covered.
Additional details and impacted files
@@ Coverage Diff @@
## master #69 +/- ##
============================================
- Coverage 67.82% 62.18% -5.65%
+ Complexity 556 518 -38
============================================
Files 102 105 +3
Lines 1930 1962 +32
Branches 209 215 +6
============================================
- Hits 1309 1220 -89
- Misses 564 677 +113
- Partials 57 65 +8
Impacted Files
Coverage Δ
...u/address/logic/commands/DeleteMeetingCommand.java
0.00% <0.00%> (ø)
.../seedu/address/logic/parser/AddressBookParser.java
62.96% <0.00%> (-2.43%)
:arrow_down:
...dress/logic/parser/DeleteMeetingCommandParser.java
0.00% <0.00%> (ø)
src/main/java/seedu/address/model/Model.java
100.00% <ø> (ø)
...rc/main/java/seedu/address/model/ModelManager.java
74.07% <0.00%> (-9.88%)
:arrow_down:
...el/meeting/exceptions/ImpreciseMatchException.java
0.00% <0.00%> (ø)
...main/java/seedu/address/model/meeting/Meeting.java
71.05% <42.85%> (-8.70%)
:arrow_down:
...u/address/logic/commands/CreateMeetingCommand.java
17.64% <45.45%> (-76.95%)
:arrow_down:
...dress/logic/parser/CreateMeetingCommandParser.java
82.60% <77.77%> (-17.40%)
:arrow_down:
...edu/address/logic/commands/EditMeetingCommand.java
57.14% <100.00%> (-38.10%)
:arrow_down:
... and 11 more
Help us with your feedback. Take ten seconds to tell us how you rate us. Have a feature suggestion? Share it here.
:umbrella: View full report at Codecov.
:loudspeaker: Do you have feedback about the report comment? Let us know in this issue.
| gharchive/pull-request | 2022-10-26T09:37:24 | 2025-04-01T06:36:40.288161 | {
"authors": [
"HakkaNgin",
"codecov-commenter"
],
"repo": "AY2223S1-CS2103-F13-3/tp",
"url": "https://github.com/AY2223S1-CS2103-F13-3/tp/pull/69",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1412782527 | Added Edit Record Functionality
Added EditRecordCommand, EditRecordCommandParser for editing of a patient's records
Added test classes for EditRecordCommand, EditRecordCommandParser
Updated test class for DeleteRecordCommandParser
Added Exception classes for duplicate or missing Record
Enabled assertions in build.gradle
Renamed Record commands
Resolves #73 , #75
Codecov Report
Merging #79 (2510761) into master (f0fa370) will increase coverage by 0.94%.
The diff coverage is 73.83%.
@@ Coverage Diff @@
## master #79 +/- ##
============================================
+ Coverage 69.85% 70.80% +0.94%
- Complexity 535 567 +32
============================================
Files 94 98 +4
Lines 1712 1819 +107
Branches 172 196 +24
============================================
+ Hits 1196 1288 +92
+ Misses 465 463 -2
- Partials 51 68 +17
Impacted Files
Coverage Δ
.../java/seedu/address/logic/commands/AddCommand.java
85.71% <ø> (ø)
...seedu/address/logic/commands/AddRecordCommand.java
86.66% <ø> (ø)
...edu/address/logic/commands/ClearRecordCommand.java
100.00% <ø> (ø)
...du/address/logic/commands/DeleteRecordCommand.java
85.71% <ø> (ø)
...eedu/address/logic/commands/FindRecordCommand.java
81.81% <ø> (ø)
...eedu/address/logic/commands/ListRecordCommand.java
100.00% <ø> (ø)
.../seedu/address/logic/parser/AddressBookParser.java
68.00% <0.00%> (-2.84%)
:arrow_down:
src/main/java/seedu/address/model/Model.java
100.00% <ø> (ø)
...el/person/exceptions/DuplicateRecordException.java
0.00% <0.00%> (ø)
...del/person/exceptions/RecordNotFoundException.java
0.00% <0.00%> (ø)
... and 9 more
:mega: We’re building smart automated test selection to slash your CI/CD build times. Learn more
| gharchive/pull-request | 2022-10-18T08:21:52 | 2025-04-01T06:36:40.304366 | {
"authors": [
"anthonyhoth",
"codecov-commenter"
],
"repo": "AY2223S1-CS2103T-T14-3/tp",
"url": "https://github.com/AY2223S1-CS2103T-T14-3/tp/pull/79",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1976894342 | [PE-D][Tester B] Add command allows non numeric characters for phone number
The add command allows non numeric characthers entered for the phone number.
Command Used: add n/Test2 p/123/// e/[email protected]
Labels: severity.Medium type.FunctionalityBug
original: nicholascher/ped#8
Duplicate of #166.
| gharchive/issue | 2023-11-03T20:33:19 | 2025-04-01T06:36:40.307100 | {
"authors": [
"Cloud7050",
"nus-pe-script"
],
"repo": "AY2324S1-CS2103-W14-3/tp",
"url": "https://github.com/AY2324S1-CS2103-W14-3/tp/issues/170",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1953700431 | Combine sample data, examples into ProductionData
There are some classes that handle sample data that end users will see, such as SampleContactsUtil. There are also a fair few locations with help text that include many arbitrary strings of valid data. We could combine these into some form of ProductionData file, similar to how we now have a TestData. Note that such a file should likely be in main/ and not test/.
The class may contain mostly valid data, but in case invalid data is needed, it could contain that too. Then, we could import these values into TestData for use in our actual tests, replacing any existing values that serve the same purpose (it may turn out to be good to ensure the values we show to users are actually valid/invalid).
This may be considered a subissue of #76.
Nice to have, but won't negatively impact grading if not done. Culling this issue.
| gharchive/issue | 2023-10-20T07:26:24 | 2025-04-01T06:36:40.309455 | {
"authors": [
"Cloud7050"
],
"repo": "AY2324S1-CS2103-W14-3/tp",
"url": "https://github.com/AY2324S1-CS2103-W14-3/tp/issues/95",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1956227436 | A revamped GUI for the existing UI
Note: The whole UI will be done by v1.3b.
And Also revamping success messages etc
| gharchive/issue | 2023-10-23T02:23:06 | 2025-04-01T06:36:40.310329 | {
"authors": [
"LimJH2002",
"lyuanww"
],
"repo": "AY2324S1-CS2103T-F08-1/tp",
"url": "https://github.com/AY2324S1-CS2103T-F08-1/tp/issues/73",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1958724289 | add PersonCreator class
Fixes #63 and #64
Should we just name it as PersonBuilder? I think that's the convention for this builder pattern.
What do you guys think about following the pattern described here.
The Person constructor will be private, and PersonBuilder will be a static nested class inside Person (so that it has access to the Person constructor). Any Person object will be created only through PersonBuilder. Optional attributes can be added by calling the corresponding with... method. The final collection of fields will then be used to create a Person using the build method.
Good idea! I will reference this in another issue.
| gharchive/pull-request | 2023-10-24T08:04:06 | 2025-04-01T06:36:40.313192 | {
"authors": [
"dickongwd",
"lilozz2",
"zhyuhan"
],
"repo": "AY2324S1-CS2103T-F11-4/tp",
"url": "https://github.com/AY2324S1-CS2103T-F11-4/tp/pull/68",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1976695659 | [PE-D][Tester B] Improvement on the warning message returned
Command that I input: add /nMary p/11111 c/1111 t/1233
Response: Invalid command format!
add: Adds a person to the address book. Parameters: n/NAME p/PHONE [e/EMAIL] [a/ADDRESS] [th/TELEHANDLE] [t/TAG]... [c/COURSE]...
Example: add n/John Doe p/98765432 e/[email protected] a/311, Clementi Ave 2, #02-25 th/@Johnnnnyyy t/Friend c/CS2100 c/CS2103T c/IS1108
Perhaps try to specify which type of input the user input in an invalid format would be more useful. Currently, I am not sure about whether the input type of phone, course or tag is wrong in the case provided above.
Labels: severity.Low type.DocumentationBug
original: NgChunMan/ped#1
Note taken on feature suggestion.
| gharchive/issue | 2023-11-03T18:04:45 | 2025-04-01T06:36:40.316362 | {
"authors": [
"Carlintyj",
"nus-pe-script"
],
"repo": "AY2324S1-CS2103T-T17-4/tp",
"url": "https://github.com/AY2324S1-CS2103T-T17-4/tp/issues/174",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1920683794 | Create Project Portfolio Page
Create skeletal Project Portfolio Page (PPP)
Completed [#40]
| gharchive/issue | 2023-10-01T10:14:00 | 2025-04-01T06:36:40.317218 | {
"authors": [
"GSgiansen",
"jinyuan0425"
],
"repo": "AY2324S1-CS2103T-W17-4/tp",
"url": "https://github.com/AY2324S1-CS2103T-W17-4/tp/issues/37",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2629368506 | AddClaim unable to take in parameters in any order
Should we implement AddClaim such that parameters can be in any order (as stated in the original user guide)? There is a contradiction between the user guide and AddClaim parser.
When order of parameters is changed, the wrong error message is shown as well.
Possible solution 1: Remove the point on "Parameters can be in any order" from the user guide
Possible solution 2: Implement ability to take in parameters in any order for AddClaim
I personally think solution 1 would be better
i think solution 1 is also best, but if u wanna explore solution 2, maybe we can see how it is done in the "Add" command
i fixed it as such in my most recent PR
| gharchive/issue | 2024-11-01T16:15:57 | 2025-04-01T06:36:40.320008 | {
"authors": [
"RezwanAhmed123",
"zi-yii"
],
"repo": "AY2425S1-CS2103-F12-1/tp",
"url": "https://github.com/AY2425S1-CS2103-F12-1/tp/issues/182",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2618780541 | [Alek] ummatch [non-natural num] [non-natural num] feedback not specific
I think this is not really a format issue?
This error message is not specific that the issue is with the index
| gharchive/issue | 2024-10-28T15:34:45 | 2025-04-01T06:36:40.322086 | {
"authors": [
"KengHian",
"Quasant"
],
"repo": "AY2425S1-CS2103-F13-4/tp",
"url": "https://github.com/AY2425S1-CS2103-F13-4/tp/issues/148",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |