Conformance 2.2: Added New Tests Specified by Test-Pack.#1586
Conversation
kriswest
left a comment
Please don't disable the application of prettier to the codebase, as this will cause very large diffs whenever the code generation is run, unless prettier is manually run afterwards. Please provide more detail on what issues it has caused for you and let's work out a solution.
Is there a GitHub Action to ensure that prettier has been run? We should run lint (without fixes) to ensure that no code gets into the repo that does not pass linting. As I said, you can't rely on husky at all. It does not run from Visual Studio Code when you do a commit (at least it didn't; I am not aware of any fixes for this). I assume that most people contributing would use VS Code? We should probably add a recommended extension setting so that people install the VS Code prettier plugin, so that changes are made whenever you save a file.
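To answer the question about enforcing this in CI: one option is a workflow that fails when prettier has not been run. This is a minimal sketch, not taken from this repo — the workflow name, node version, and the assumption that `npm ci` installs prettier are all mine:

```yaml
# Hypothetical GitHub Actions workflow: fails the build if any file
# is not prettier-formatted.
name: format-check
on: [pull_request]
jobs:
  prettier:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 20
      - run: npm ci
      - run: npx prettier --check .
```

`prettier --check` exits non-zero when any file differs from prettier's output, which fails the job without modifying the code.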
If the husky pre-commit hook isn't working using that shell script, then we can instead configure it in the package.json file by adding:

    "husky": {
      "hooks": {
        "pre-commit": "npx lint-staged"
      }
    }

That might provide better cross-platform support. The original set-up instructions I used (to restore prettier use in the repo after the refactor) came from prettier: https://prettier.io/docs/install#git-hooks
…e new event listeners in intent-k app
This is passing for me now. I suggest we get @julianna-ciq to engage with the interop.io guys on testing against this. Giles, I presume you're able to run against your stuff too, right? WDYT?

I'll leave you to figure out what to do about husky / pre-commit checks.

I have forwarded this PR to our engineers. I'll follow up with you when I have new information, @robmoffat.
I had a go at running this against our implementation and it did seem to correctly run the tests up to a point. As last time we tried running, we got to a certain point and then the tests stopped running. For this test run it got to 93% and got no further. The last test to run was:

"after each" hook: afterEach for "(2.0-ACBasicUsage1) Should receive context when app a adds a listener and app B broadcasts to the same app channel"

However I also tried to run against the reference implementation and I got lots of failures and an incomplete test there as well. I got quite a few:

Error: App didn't return close context within .5 secs

but also other timeouts of 5000ms, 20000ms and various assertion errors and AppTimeout. I don't think that there is a report that I can copy and paste to show all the results.
this is a long shot, but is popup blocking turned off?
Well this is embarrassing... Yes, that was the problem. I'd been checking the conformance test window for the popup-blocked warning, but of course that appears in the DA window. When I fixed that, all the tests ran and passed. I did get to 143% though!! 🙂
@robmoffat how do you feel about merging this before #1726? I've got a further set of changes to apply to #1726 where I've removed every 2.0 etc. (not just from the names, but filenames, test names etc.) to align fully with the 'just maintain the conformance tests against the current version and use tags for older versions' plan. It touches a lot of files, so it might be easiest to make sure this is tested, then get the renaming PR in next?
 * channel.
 */
newChannelId: null | string;
currentChannelId?: null | string;
@robmoffat this is probably the thing that broke the typescript build - quicktype is making the two possible fields both optional because it can't express in TypeScript that one or other must be set (as JSONSchema can). Will need handling in the code
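One way to handle this in the code is a runtime guard over the generated type. A minimal sketch, assuming a hypothetical payload shape mirroring what quicktype emits (both fields optional, even though the JSON Schema requires at least one):

```typescript
// Hypothetical shape mirroring the generated type: quicktype makes both
// fields optional because TypeScript can't express "one or the other
// must be present" the way the JSON Schema can.
interface ChannelChangedEventPayload {
  newChannelId?: null | string;
  currentChannelId?: null | string;
}

// Runtime guard enforcing the schema's "at least one field set" rule.
function hasChannelId(p: ChannelChangedEventPayload): boolean {
  return 'newChannelId' in p || 'currentChannelId' in p;
}

console.log(hasChannelId({ newChannelId: null })); // true
console.log(hasChannelId({}));                     // false
```

Note that `'newChannelId' in p` distinguishes a field explicitly set to `null` from a field that is absent, which a simple truthiness check would not.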
kriswest
left a comment
Super close to done. I'm still nit-picking the changelogs and the FDC3Event type needs its type field back (it being string was likely my mistake/brainfart, sorry).
I'll spend a few mins trying to run this against the reference implementation now.
It's the Channels tests that are throwing the progress % off - if you run them separately, progress isn't counted. There are 25 channel tests (of 84 total) and 25 / (84-25) ≈ 43% (which is why it counts to 143%). Now to figure out how to fix it...
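The arithmetic behind the overshoot can be checked directly (numbers taken from the comment above; the result lands close to the 143% observed):

```typescript
// 25 of the 84 tests are channel tests; if the runner's denominator
// omits them, reported progress overshoots 100%.
const total = 84;
const channelTests = 25;
const denominator = total - channelTests; // 59
const overshootPct = (total / denominator) * 100;
console.log(overshootPct.toFixed(1)); // "142.4"
```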
…test suites are assembled
The channel tests were being assembled asynchronously, but not being awaited properly, leading to their assembly happening after the runner had started, causing it to miscalculate the progress %
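The bug pattern described in that commit message can be sketched in miniature (all names here are hypothetical, not from the conformance codebase):

```typescript
// Stand-in for asynchronously loading test definitions into a suite list.
async function assembleChannelSuites(suites: string[]): Promise<void> {
  await Promise.resolve(); // simulates async assembly work
  suites.push('channel-tests');
}

// Bug: the assembly promise is not awaited, so the runner counts the
// suites before assembly has completed.
async function brokenRunner(): Promise<number> {
  const suites: string[] = [];
  assembleChannelSuites(suites); // missing await
  return suites.length;
}

// Fix: await assembly before the runner starts counting.
async function fixedRunner(): Promise<number> {
  const suites: string[] = [];
  await assembleChannelSuites(suites);
  return suites.length;
}

brokenRunner().then(n => console.log('broken count:', n)); // 0
fixedRunner().then(n => console.log('fixed count:', n));   // 1
```

With the missing `await`, the runner's suite count (and thus its progress denominator) is computed before the channel tests exist, which is exactly the miscounting described.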
@robmoffat got the fix for the progress reporting on #1748, which you can merge into this. Fix the changelog and the FDC3Event type and I think this is ready to merge (and we can do the rename PR afterwards)
fix the progress reporting in conformance tests
Still looking good after your PR @kriswest
over to you @kriswest
kriswest
left a comment
Need to restore the section heading in the changelog, otherwise I'm ready to approve and merge @robmoffat. I'll raise an issue for checking the returned FDC3 version number.
const getAgent2_2 = (fdc3: DesktopAgent, documentation: string) => {
  it('(GetAgentAPI) Method is callable', async () => {
    const info = await fdc3.getInfo();
    assert.isTrue(info.fdc3Version.startsWith('2.'), documentation);
I still think we should check specific version (you need to report the matching version number of the test set and to use the test set for the right version...). We can pick up the version number from the fdc3-standard - but we can pick that up in a subsequent PR.
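A stricter check along those lines might look like this — a sketch only, where the expected-version constant is an assumption (picking it up from the fdc3-standard package is left for the subsequent PR):

```typescript
// Assumed constant; in practice this would be read from the fdc3-standard
// package so the test set always matches the version it claims to test.
const EXPECTED_FDC3_VERSION = '2.2';

// Exact-match check, unlike the startsWith('2.') prefix check, which
// would also pass for 2.0, 2.1, etc.
function matchesExpectedVersion(reported: string): boolean {
  return reported === EXPECTED_FDC3_VERSION;
}

console.log(matchesExpectedVersion('2.2')); // true
console.log(matchesExpectedVersion('2.0')); // false -- the prefix check would pass this
```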





payload.channelId should be null. #1611

Describe your change

Adds two new FDC3 2.2 conformance tests per test-pack specification and makes the conformance framework deployment-aware for netlify previews, localhost, and production.

Changes

Conformance Tests
- ChannelChangedEvent test verifying event listener behavior

Deployment Infrastructure
- replaceConformanceUrls.js script for dynamic URL injection
- preview-conformance.v2.json and website-conformance.v2.json app directories

Fixes
- WCP1Hello schema property name in HelloHandler

Dependencies
Related Issue
Resolves #1455, #1611
Raises #1585
Contributor License Agreement