
feat: show unclaimed permits #172

Conversation

gentlementlegen marked this pull request as ready for review February 22, 2024 03:00
ubiquity-os-deployer (bot) commented Feb 22, 2024

rndquu self-requested a review February 22, 2024 09:36

0x4007 (Member) commented Feb 22, 2024

2a3fe2a

I broke the RPC thing and pushed to development (sorry about this! I've been in the middle of @ubiquity/cloudflare-deploy-action, @ubiquity/ts-template, and @ubiquity/pay.ubq.fi, making them all interoperable for the continuous deployments).

For a quick review you might be able to pull from mine: #175

static/scripts/audit-report/audit.ts (outdated review thread, resolved)

   export function shortenTransactionHash(hash: string | undefined, length = 8): string {
  -  if (!hash) return "";
  +  if (!hash || hash === TX_EMPTY_VALUE) return "UNCLAIMED";
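For context, a minimal sketch of what the changed helper might look like in full. TX_EMPTY_VALUE is referenced but not defined in this thread, and the truncation branch is an assumption about typical shortening behavior, not the actual audit.ts code:

  // Assumed sentinel for permits that have no claim transaction recorded.
  const TX_EMPTY_VALUE = "null";

  export function shortenTransactionHash(hash: string | undefined, length = 8): string {
    // A missing or sentinel hash means the permit was never claimed.
    if (!hash || hash === TX_EMPTY_VALUE) return "UNCLAIMED";
    // Assumed truncation: keep the 0x prefix plus the leading and trailing
    // halves of the hash, e.g. 0x1a2b...7d6c for length = 8.
    const half = Math.floor(length / 2);
    return `${hash.slice(0, half + 2)}...${hash.slice(-half)}`;
  }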

rndquu (Member) left a comment

Here is the audit result for the https://github.com/ubiquity/business-development repo:
[Screenshot 2024-02-22 at 13:01:48]

This issue is missing in the final report. As far as I understand from the comments in that issue, those permits were not claimed. Please make sure that unclaimed permits are displayed.

gentlementlegen (Member Author)

> Here is the audit result for the https://github.com/ubiquity/business-development repo: […]
> Please make sure that unclaimed permits are displayed.

Thanks for your input. I cannot test without the secret key and could only try on https://github.com/ubiquity/ubiquibot; please let me know if the fix works.

I changed the logic so the items are added after all operations are complete, instead of having them added as soon as the RPC + DB fetches are complete, which should solve the issue (see the sketch below). Added the corresponding reset as well.
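A sketch of that reordering, with hypothetical function names standing in for the actual audit.ts helpers: rendering is gated on every fetch settling, not just the RPC + DB pair.

  type Item = { issueUrl: string; txHash?: string };

  // Hypothetical helpers standing in for the real audit.ts functions.
  declare function fetchChainTransactions(): Promise<Item[]>;
  declare function fetchPermitsFromDb(): Promise<Item[]>;
  declare function fetchGithubDetails(): Promise<Item[]>;
  declare function resetTable(): void;
  declare function appendItems(items: Item[]): void;

  async function renderAuditReport(): Promise<void> {
    resetTable(); // the "corresponding reset" mentioned above

    // Wait for ALL operations, instead of appending as soon as the
    // RPC + DB fetches alone complete; late results can no longer be dropped.
    const [chainTxs, dbPermits, details] = await Promise.all([
      fetchChainTransactions(),
      fetchPermitsFromDb(),
      fetchGithubDetails(),
    ]);

    appendItems([...chainTxs, ...dbPermits, ...details]);
  }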

0x4007 (Member) commented Feb 22, 2024

> I cannot test without the secret key

What secret key?

gentlementlegen (Member Author)

> I cannot test without the secret key
> What secret key?

I meant the bot wallet address; I think it is required to get the results from a certain repo?

0x4007 (Member) commented Feb 22, 2024

> I cannot test without the secret key
> What secret key?
> I meant the bot wallet address; I think it is required to get the results from a certain repo?

Results are publicly posted on the blockchain and in GitHub comments. No private key required.

gentlementlegen (Member Author)

> I cannot test without the secret key
> What secret key?
> I meant the bot wallet address; I think it is required to get the results from a certain repo?
> Results are publicly posted on the blockchain and in GitHub comments. No private key required.

You are correct. Since part of the results comes from the blockchain, I did not know the bot wallet address, but I realize it is actually the address I see whenever transactions are made, so I could have figured it out myself; my bad.

0x4007 removed their request for review February 22, 2024 22:33

0x4007 (Member) commented Feb 22, 2024

I've got a few things on my plate and won't be able to handle this review. @rndquu, I hope you can get this resolved. Also, as a heads up @FernandVEYRIER: as soon as this is merged in, I would like to get this into another repo.

rndquu (Member) commented Feb 22, 2024

@FernandVEYRIER

> I cannot test without the secret key

You can QA the audit page this way:

  1. Create a new GitHub personal access token
  2. Use this quick import for testing:

  {
    "WALLET": "0xf87ca4583C792212e52720d127E7E0A38B818aD1",
    "REPO": "https://github.com/ubiquity/business-development",
    "PAT": "YOUR_PAT"
  }

Could you make sure that ubiquity/business-development#67 is displayed in the audit results when running from the latest commit in this PR?

gentlementlegen (Member Author)

> You can QA the audit page this way: […]
> Could you make sure that ubiquity/business-development#67 is displayed in the audit results when running from the latest commit in this PR?

@rndquu thank you. I was trying it that way before and never got any result, which is why I started questioning the WALLET value. Here is the result of the query:

[screenshot: empty query result]

I am not aware of any API rate limiter; maybe I am missing some values in my env, or something is wrong with my setup in general. I am also using the database with id wfzpewmlyiozupulbuur, which contains 0 permits for that repo. I will investigate in the meantime.

gentlementlegen (Member Author) commented Feb 23, 2024

@rndquu After investigation, there seemed to be something wrong with the saved local (browser) database; after clearing it, the fetching works again so far. But now that we don't read the GitHub issues anymore, the items have to be saved somewhere in the Supabase DB in order to be displayed. Are you using another DB than the one I have right now?

Come to think of it, we shouldn't need to fetch the transactions from etherscan.io and gnosisscan.io anymore, since all of this information should be saved in the database in the first place, shouldn't it? I think it should all be rewritten to rely solely on the DB's values as the source of truth; that would drastically speed up the fetching process and avoid querying the Ethereum / Gnosis networks.

0x4007 (Member) commented Feb 23, 2024

> Are you using another DB than the one I have right now?

No, there's just one database, and we are really only using it for saving wallets right now.

Yes, we are in a transition period, but the idea is to query all the necessary information from our database. The thing is, we are unsure how long it will take to finalize the database, because the infrastructure is evolving and we need to change the database schema to accommodate it.

Fortunately, with the plugin infrastructure we can isolate tables per plugin to make things more adaptable in our rapid R&D phase.

gentlementlegen (Member Author)

Made significant changes to accommodate how the permits are fetched. Now the DB should be the only source of truth, holding transactions, amounts, and users. This allows much better performance, since we no longer need to fetch the transactions from the chains and can rely on a single DB call to get all the needed data. We still need GitHub to fetch the user and issue details, as they are not saved in the DB.
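A sketch of what that single DB call could look like with the Supabase JS client. The table and column names here are assumptions for illustration; the actual schema in this PR may differ:

  import { createClient } from "@supabase/supabase-js";

  const supabase = createClient("https://YOUR-PROJECT.supabase.co", "YOUR_ANON_KEY");

  // Assumed schema: a permits table with an embedded users relation.
  // One query returns transactions, amounts, and beneficiaries together,
  // replacing the per-permit etherscan/gnosisscan lookups.
  const { data: permits, error } = await supabase
    .from("permits")
    .select("amount, transaction, users ( wallet_address )");

  if (error) throw error;

  // A null transaction marks an unclaimed permit.
  const unclaimed = permits.filter((permit) => permit.transaction === null);
  console.log(`${unclaimed.length} unclaimed permit(s)`);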

0x4007 (Member) commented Feb 25, 2024

@rndquu can you handle this review?

rndquu (Member) commented Feb 26, 2024

@FernandVEYRIER Why is the final report empty?

[Screenshot 2024-02-26 at 17:53:25: empty report]

0x4007 marked this pull request as draft February 27, 2024 04:17

gentlementlegen (Member Author)

@rndquu there is no record in the database for the business-development repo, hence the empty result. Now all the data is pulled from the database, from transactions to wallets to permits, because I believe this is the way it should be done, or at least in the near future. The DB should hold this information without us needing to parse each Gnosis / Ethereum transaction. You can give it a try with the ubiquibot repositories, which I added for testing.

Eventually we should maybe consider a script to help us populate the DB with the missing previous transactions and users.

0x4007 (Member) commented Feb 29, 2024

Remarks on database schema:

Although I appreciate you taking the time to understand the relationships between each table, for the sake of modularity, I think it makes the most sense to have a separate table/db per plugin, which means that a full understanding of the relationships across all tables is no longer necessary. Instead the developer only needs to concern themselves with what plugin interacts with the data that's relevant to the task at hand.

Locations table idea should be completely removed. The idea here was to keep track of where any activity happened, for auditability purposes. The problem is that it introduced significant complexity and had minimal benefit. Instead we should just add a new column on every table for the URL of the entity we are referring to (e.g. a GitHub comment, an issue, a user, a pull request). It is slightly space-inefficient, but makes working with the database significantly easier (and still allows for easy auditing).

The finance related stuff (permits, settlements, credits, debits) should be handled only by the permit generation plugin.

And so on, the other tables should have specific plugins they are associated with.

The only exception I potentially see: logs are actually nice to have for the kernel, unless Cloudflare has a good built-in solution (we built this because Netlify's wasn't great). If Cloudflare has a good and free solution, then perhaps we should migrate to that.

> The problem with working with URLs for the audit is that it implies parsing the page, like now for comments, URLs, PRs and so on, which is very inefficient and unreliable (if tomorrow GitHub decides to change one div name, it breaks everything), so I think the database was beneficial in that regard. It speeds up the process significantly and allows for scaling, because the more issues there are, the slower parsing them would be. I agree this is a case-by-case thing, but for example for this repo, using a DB instead of cross-referencing Gnosis / Ethereum / pull requests made the code significantly easier to read and maintain. Maybe we should have a separate module only handling GitHub events with hooks and keeping records in a database? It might be a viable solution. Let me know; otherwise we can abandon this PR, I guess.

The GitHub issues REST API is actually designed around the following parameters:

Organization name, repository name, issue/pull request number.

These can easily be extracted from the URL.

I assumed that if I had the comment ID, I could directly look up the comment, but we would have to use GraphQL for all that. I would much, much rather we use the normal REST API for all of our infrastructure, because it's significantly easier with its built-in methods and comes with type safety.

The alternative is that we have the comment ID embedded as the # fragment in the URL; then we load that issue/PR and can find the comment from there.
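A sketch of that REST flow with @octokit/rest: the owner, repo, and issue number are parsed straight out of a stored URL and fed to the typed client. The parsing helper is illustrative, not existing code in this repo:

  import { Octokit } from "@octokit/rest";

  const octokit = new Octokit({ auth: process.env.GITHUB_TOKEN });

  // Extract the REST parameters from a GitHub issue/PR URL,
  // e.g. https://github.com/ubiquity/business-development/issues/67
  function parseIssueUrl(url: string) {
    const match = url.match(/github\.com\/([^/]+)\/([^/]+)\/(?:issues|pull)\/(\d+)/);
    if (!match) throw new Error(`Not a GitHub issue/PR URL: ${url}`);
    return { owner: match[1], repo: match[2], issue_number: Number(match[3]) };
  }

  const params = parseIssueUrl("https://github.com/ubiquity/business-development/issues/67");
  const { data: issue } = await octokit.rest.issues.get(params);
  console.log(issue.title, issue.state);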

gentlementlegen (Member Author)

> Remarks on database schema: […] I would much, much rather we use the normal REST API for all of our infrastructure, because it's significantly easier with its built-in methods and comes with type safety. […]

As a matter of personal taste, GraphQL is usually way easier to use and even more type-safe than REST APIs, since types are generated from the passed arguments, whereas in REST you can send anything. But without using the DB, we would also have to call the Ethereum / Gnosis networks again to check the status of the claim, wouldn't we?

0x4007 (Member) commented Feb 29, 2024

> As a matter of personal taste, GraphQL is usually way easier to use and even more type-safe than REST APIs, since types are generated from the passed arguments, whereas in REST you can send anything.

I mean REST using the @octokit/rest module. We manipulate the REST API with the provided type-safe methods. I am not 100% sure about the GraphQL SDK, but given that we pass in huge arbitrary strings, it seems unlikely to me that it supports type safety.

> But without using the DB, we would also have to call the Ethereum / Gnosis networks again to check the status of the claim, wouldn't we?

Yes, but per my earlier research from a few days ago, in the context of the pay.ubq.fi rewards interface: given the number of functional RPC endpoints available to us, and given that most people claim within 12 hours of posting, we can feasibly find any specific claim status almost in "realtime" at page load.

I'm on mobile, so it's not easy for me to find the original conversation on this.
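For reference, a sketch of a near-realtime claim check, under the assumption that these rewards are Permit2 signature transfers with unordered nonces (if the permits use a different contract, the lookup differs). Permit2's public nonceBitmap reveals whether a permit's nonce has been consumed:

  import { ethers } from "ethers";

  // Canonical Permit2 deployment address (same on Ethereum and Gnosis).
  const PERMIT2 = "0x000000000022D473030F116dDEE9F6B43aC78BA3";
  const ABI = ["function nonceBitmap(address, uint256) view returns (uint256)"];

  async function isPermitClaimed(rpcUrl: string, owner: string, nonce: bigint): Promise<boolean> {
    const provider = new ethers.JsonRpcProvider(rpcUrl);
    const permit2 = new ethers.Contract(PERMIT2, ABI, provider);
    // Unordered nonces: the high 248 bits select a word in the bitmap,
    // the low 8 bits select the bit within that word.
    const word: bigint = await permit2.nonceBitmap(owner, nonce >> 8n);
    return ((word >> (nonce & 0xffn)) & 1n) === 1n;
  }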

0x4007 (Member) commented Feb 29, 2024

Found it #138 (comment)

gentlementlegen (Member Author)

@pavlovcik You can simply use graphql-codegen on the GitHub endpoints to automatically generate all the TypeScript interfaces 😄
To make it realtime, I thought we could subscribe to GitHub webhooks on pull request updates and write to the DB on permit claims directly, so everything stays realtime; a sketch follows below. The problem with the previous version of this package was that on busy repositories the audit took me around 3 to 4 minutes to fetch everything.
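A sketch of that idea as a Cloudflare Worker; the endpoint shape, env bindings, and permits table are hypothetical. GitHub delivers the webhook, the worker upserts a row, and the audit page only ever reads the DB:

  // Hypothetical Cloudflare Worker: mirrors GitHub webhook deliveries into the
  // DB so the audit page stays realtime on DB reads alone.
  export default {
    async fetch(request: Request, env: { SUPABASE_URL: string; SUPABASE_KEY: string }) {
      if (request.method !== "POST") return new Response("method not allowed", { status: 405 });

      const event = request.headers.get("x-github-event");
      const payload: any = await request.json();

      // React to the events that can carry permit activity.
      if (event === "issue_comment" || event === "pull_request") {
        await fetch(`${env.SUPABASE_URL}/rest/v1/permits`, {
          method: "POST",
          headers: {
            apikey: env.SUPABASE_KEY,
            "content-type": "application/json",
            prefer: "resolution=merge-duplicates", // PostgREST upsert
          },
          body: JSON.stringify({
            node_url: payload.issue?.html_url ?? payload.pull_request?.html_url,
          }),
        });
      }
      return new Response("ok");
    },
  };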

0x4007 (Member) commented Feb 29, 2024

Because we don't have any test data in the database, it's hard to end-to-end test this. Based on the result from pay.ubq.fi, at least it shows me that it renders details.

I think we can merge this in and follow up here if there are issues down the line.

[Screenshot 2024-02-29 at 15:01:56]

Although, to be honest, I'm confused about why this is here. Is it just a test entry? I can't find the generated permit.

#174

  {
    "WALLET": "0x44ca15db101fd1c194467db6af0c67c6bbf4ab51",
    "REPO": "https://github.com/ubiquity/pay.ubq.fi",
    "PAT": ""
  }

gentlementlegen (Member Author)

> Because we don't have any test data in the database, it's hard to end-to-end test this. […] Is it just a test entry? I can't find the generated permit.

@pavlovcik Yes, I manually entered this one for the sake of testing. I can create a script to populate the DB with real data if needed.

0x4007 (Member) commented Feb 29, 2024

No worries about the script. We can test the rest in production. Thank you.

rndquu (Member) commented Feb 29, 2024

To sum up: we're good to show only issues with claimed permits, because it's easier and faster to read from the DB than to parse GitHub or onchain data.

gentlementlegen (Member Author)

> To sum up: we're good to show only issues with claimed permits, because it's easier and faster to read from the DB than to parse GitHub or onchain data.

Issues without a permit should also show up with the latest fix, as long as there is a record of that issue in the DB.

rndquu (Member) commented Feb 29, 2024

> Issues without a permit should also show up with the latest fix, as long as there is a record of that issue in the DB.

But there is no "issues" table in the DB; where do we get the issue-related info?

gentlementlegen (Member Author)

> But there is no "issues" table in the DB; where do we get the issue-related info?

There are issues in the locations table. I created a view with a DISTINCT on these issues to get the whole list. That reminds me: I currently don't know how migrations are handled.

0x4007 (Member) commented Feb 29, 2024

I really want to drop the locations table because it's a liability dealing with all the foreign keys I made to it.

gentlementlegen (Member Author)

> I really want to drop the locations table because it's a liability dealing with all the foreign keys I made to it.

Might we want to rethink the structure of the database? We should think about the most efficient way to sync the data with GitHub and the transactions.

0x4007 (Member) commented Feb 29, 2024

Definitely URL. At minimum it allows for instant auditability just by clicking the link on the Supabase UI.

gentlementlegen (Member Author)

> Definitely URL. At minimum it allows for instant auditability just by clicking the link on the Supabase UI.

The locations table currently has a node_url column containing the URL of the issues.

rndquu (Member) left a comment

To sum up:

  1. We're retrieving issues from the DB's locations table
  2. In the future the locations table will be removed, so the audit page will have to be refactored to read issues from some other table

@FernandVEYRIER Could you reopen this PR towards the https://github.com/ubiquity/audit.ubq.fi repo?

0x4007 (Member) commented Mar 1, 2024

> Definitely URL. At minimum it allows for instant auditability just by clicking the link on the Supabase UI.
> The locations table currently has a node_url column containing the URL of the issues.

I know; I implemented all of the current database schema.

Instead, we should have a URL column on every table that links to the GitHub entity behind the row data.

In exchange for the redundancy, we get high convenience for auditing and debugging.

gentlementlegen (Member Author)

@rndquu Reopened the PR at ubiquity/audit.ubq.fi#2

0x4007 (Member) commented Mar 3, 2024

> @rndquu Reopened the PR at ubiquity/audit.ubq.fi#2

You should close this pull request if it's no longer necessary.

gentlementlegen (Member Author)

@pavlovcik I resolved the merge conflict in case you decide to merge it; everything should be solved at that point. The other PR will be awaiting reviews in the meantime.

rndquu (Member) commented Mar 4, 2024

Closing in favour of ubiquity/audit.ubq.fi#2

Development

Successfully merging this pull request may close these issues.

audit dApp: Show issues with unclaimed permits