Standard output from workers gets corrupted, consider sending over IPC instead #1758
Comments
I don't think this is a transpiling issue. Since tests run in child processes it might just be hard to forward standard output reliably, especially if there's several kilobytes of it. Perhaps you could give #1722 a try? It has better process communication and writes all standard output from the test files to the console. Data may still get cut up into multiple chunks, though. I've considered capturing all standard output and sending it through the process communication channel, which should be more consistent, but didn't think it was worth it. It might solve your use case, so perhaps it's something to consider. (I'm closing this issue for housekeeping purposes, for now, but let's keep the conversation going.)
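For illustration, a minimal sketch of that idea (not AVA's actual worker code): intercept writes to process.stdout in the worker and forward each chunk over the IPC channel, so it reaches the main process as one whole message instead of arbitrary stream chunks.

// Hypothetical worker-side hook: a sketch of the idea, not AVA's implementation
const originalWrite = process.stdout.write.bind(process.stdout)

process.stdout.write = (chunk, encoding, callback) => {
  if (typeof encoding === 'function') {
    callback = encoding
    encoding = undefined
  }
  if (process.send) {
    // Each chunk is delivered as a single IPC message, never split mid-way
    process.send({ type: 'worker-stdout', data: String(chunk) })
    if (callback) callback()
    return true
  }
  return originalWrite(chunk, encoding, callback)
}

// Main-process side, after child_process.fork()-ing the worker:
// worker.on('message', message => {
//   if (message && message.type === 'worker-stdout') process.stdout.write(message.data)
// })

String(chunk) assumes textual output; binary writes would need an explicit encoding.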
Thanks @novemberborn - I've whittled things down a bit and created a test case for you: AVA iTerm2 - No 1337 IMG Support, https://github.com/F1LT3R/ava-iterm2-no-1337-img. I'm happy to try #1722 if you can give me instructions on how to do this. If it's complicated, perhaps you might find it easier to run my test case than explain it to me. :)
Cheers for the test case, @F1LT3R. This is not working with #1722. I'll reopen this issue, and I am tempted to make all worker->AVA communication go over IPC, rather than having some go over stdout / stderr, but no guarantees on whether that'll actually happen. For the time being, it seems t.log() is a workable alternative.
Thanks, @novemberborn - makes sense. Good to know about t.log, I will try that out.
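For reference, t.log attaches its output to the test's own result in AVA's report instead of writing straight to stdout, which is why it isn't affected by how worker stdout is forwarded. A minimal example:

const test = require('ava')

test('logs alongside the test result', t => {
  t.log('diagnostic output, shown with this test in the report')
  t.pass()
})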
The thing that makes this really tricky, @novemberborn: I'm writing an image diff library to run in AVA, so you get image diffs in the CLI as colored ANSI output. Logging via t requires passing t to my library, which feels really weird. I wouldn't want the user to have to pass t.log into the function that calls my library just so they could see the correct visual output. Hope that makes sense. What works is if I pass in t.
This also makes my test look like this:

test('Scorecard grid', async t => {
  const img1 = 'fixtures/green-circle.jpg'
  const img2 = 'fixtures/green-circle.png'
  const opts = {
    grid: {
      columns: 8,
      rows: 8
    },
    tolerance: {
      hue: 1,
      sat: 20,
      lum: 0.2,
      alp: 0
    },
    display: {
      images: 32
    },
    $MODE: 'CLI',
    // Can't add T here (merging kills the function)
    t
  }

  // I'm forced to pass t in as a param
  // passing in t here --------------------------\
  const result = await fuzi(img1, img2, opts, t)
  t.true(result.fail)
})

Then inside my library:

if (t) {
  t.log(ansiImgDiff)
}

This is far from ideal. I would have to ask the user to pass t. This is output that should only be seen in CI. Putting some kind of "if it fails, then t.log(all the image diff info)" into each test would be so cumbersome. At the moment this is making me wish I'd stuck with Mocha, but I'm hoping I can find a workaround because I prefer AVA.
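One way to cut down that boilerplate (a hypothetical convenience wrapper, not something fuzi or AVA provides; the require name is an assumption) is to bind t once per test and reuse the bound function:

const test = require('ava')
const fuzi = require('fuzi') // assumed module name

// Hypothetical helper: bind t once so individual calls don't need it
const fuziWith = t => (img1, img2, opts) => fuzi(img1, img2, opts, t)

test('Scorecard grid', async t => {
  const compare = fuziWith(t)
  const result = await compare('fixtures/green-circle.jpg', 'fixtures/green-circle.png', { $MODE: 'CLI' })
  t.true(result.fail)
})

The user still touches t, but only once per test rather than in every call.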
Is this something I could do? Can you point me in the right direction, @novemberborn?
I'm hoping to try this in #1722. You're welcome to have a poke around as well. Perhaps start with the reporters and work your way back to
Ok, thanks @novemberborn
In case this helps anyone:
The least painful workaround I have for now is to add a reporter option to my tests:

test('My img diff test', async t => {
  const img2 = 'fixtures/green-circle.jpg'
  const img1 = 'fixtures/green-circle.png'
  const opts = {
    tolerance: {
      hue: 1,
      sat: 20,
      lum: 0.2,
      alp: 0
    },
    // PASS IN A REPORTER CALLBACK OPTION HERE
    reporter: err => {
      t.log(err)
    }
  }
  const result = await fuzi(img1, img2, opts)
  t.true(result.fail)
})

Then I can pass this back into the context of t. It appears that
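On the library side this only requires fuzi to prefer the supplied callback over console.log. A sketch of that dispatch, reconstructed from the snippets in this thread rather than from fuzi's actual source:

// Inside the library: a sketch based on the snippets above, not fuzi's real code
const report = (opts, message) => {
  if (typeof opts.reporter === 'function') {
    opts.reporter(message) // ends up in t.log via the test's callback
  } else {
    console.log(message) // plain stdout when no reporter is supplied
  }
}

// ...later, when a comparison fails:
// report(opts, ansiImgDiff)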
It does, yea.
Yea that's one concern, but I don't think raw
Is there a trick to get Even though https://travis-ci.org/F1LT3R/fuzi/builds/368486782 Locally with
@F1LT3R that's likely your CI not handling the image output. I don't think that's related to AVA.
It's not images being output. It's ANSI characters. From what I'm seeing, IPC might not be doing what you expect on Linux. (Assuming you would expect 0 corruptions over IPC.) Note: even if I remove all ANSI escape sequences from the
Sure, was just trying to say that it might depend on your CI understanding the sequences. I think Node.js serializes the IPC communication so it shouldn't get chopped up. Note that
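For what it's worth, that behaviour is easy to observe with plain child_process, nothing AVA-specific: every process.send() call is delivered as exactly one 'message' event on the other side, so a large payload can't be split the way a raw stdout pipe chunk can. A minimal sketch (file names are just for illustration):

// parent.js: fork a child and receive whole messages over IPC
const { fork } = require('child_process')

const child = fork('./child.js')
child.on('message', message => {
  // Each send() on the child side arrives here as exactly one message
  console.log(`received ${message.text.length} characters in one piece`)
})

// child.js: send a large payload as a single IPC message
process.send({ text: 'x'.repeat(1024 * 1024) }) // ~1 MB, delivered intact
process.disconnect() // close the IPC channel so both processes can exit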
I decided to land #1722 without trying this. I still want to look into this, though. If anybody else wants to give it a try, please give a shout.
Description
AVA seems to sometimes garble node-canvas data URLs that I write back out to the console via stdout. This behavior is reproducible, but slight tweaks to the values make it work or not work. Very odd. I spent the past few days thinking node-canvas was buggy. Now I think AVA might be doing some transpiling or character replacement under the hood that's throwing things off.
Test Source
This output is what I expect to see in my console after running the code...
Here is that code:
But here is what I get if I wrap that same code in an AVA test:
I get a DATA URL back, but the console log seems to get destroyed. Lines get deleted upwards and such.
Here is that code wrapped in a test. Apologies for the repetition.
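A minimal sketch of the kind of test being described, assuming node-canvas's createCanvas/toDataURL API rather than the author's exact code:

const test = require('ava')
const { createCanvas } = require('canvas') // node-canvas

test('logging a canvas data URL', t => {
  const canvas = createCanvas(200, 200)
  const ctx = canvas.getContext('2d')
  ctx.fillStyle = '#ff0000' // the background colour whose value reportedly changes the result
  ctx.fillRect(0, 0, 200, 200)

  // Several kilobytes of base64; this is the output that gets garbled
  console.log(canvas.toDataURL())
  t.pass()
})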
Here is where things get super-weird.
If I change the background color of the canvas on line 26...
... now I see the logged image.
Remember, this weird behavior does not happen without AVA.
Is this a transpiling issue?
It's a bit frustrating that I can't use AVA to run canvas tests and see the results in the CLI. As a workaround I can probably export the data URL to a file for now.
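That fallback is straightforward; a quick sketch, assuming the canvas object from a test like the one above is in scope:

const fs = require('fs')

// Write the data URL to a file instead of stdout, bypassing the console entirely
fs.writeFileSync('canvas-output.txt', canvas.toDataURL())

// Or decode it straight to a PNG for visual inspection
const base64 = canvas.toDataURL().split(',')[1]
fs.writeFileSync('canvas-output.png', Buffer.from(base64, 'base64'))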
Error Message & Stack Trace
N/A
Config
Copy the relevant section from package.json:

Command-Line Arguments
Copy your npm build scripts or the ava command used:

Environment