WIP: Change go-sbotcli-rs to parse json responses instead of using regex matches #8

Draft
notplants wants to merge 4 commits from parse-json into main
Owner

As discussed on the call earlier, I think parsing json is more deterministic and easier to reason about than using regexes.

The negative is that it looks like we may need to include serde to do this (unless I'm misreading the miniserde docs, which I might be), and we also need to include some of the sunrise-choir dependencies (cool that those crates already exist though!). So this means a larger library, but saying we "can't parse JSON" because it makes the binary too big is also a pretty luddite constraint to put on ourselves... not impossible... but I think we might need to accept parsing JSON as part of our required toolkit.

There also could be future advantages to orienting around JSON: as we start to build more complex interactions with the retrieved objects, it would help to be able to work with them as structs and create, modify, or filter them.
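To make the idea concrete, here's a minimal sketch of what the serde route could look like. The struct and field names are hypothetical stand-ins, not the actual shape of sbotcli's JSON output; it only assumes `serde` (with the `derive` feature) and `serde_json` as dependencies.

```rust
use serde::Deserialize;

// Hypothetical shape of a message as it might appear in sbotcli's JSON output;
// the real field names depend on go-sbot and may differ.
#[derive(Debug, Deserialize)]
struct SsbMessage {
    key: String,
    author: String,
    #[serde(rename = "type")]
    msg_type: String,
}

fn main() -> Result<(), serde_json::Error> {
    // Stand-in response; in practice this string would come from sbotcli's stdout.
    let raw = r#"{ "key": "%abc=.sha256", "author": "@xyz=.ed25519", "type": "about" }"#;
    let msg: SsbMessage = serde_json::from_str(raw)?;
    println!("{} published a '{}' message ({})", msg.author, msg.msg_type, msg.key);
    Ok(())
}
```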

notplants added 2 commits 2021-11-16 14:44:52 +00:00
notplants added 1 commit 2021-11-16 15:06:48 +00:00
notplants added 1 commit 2021-11-16 15:15:56 +00:00
notplants changed title from Change go-sbotcli-rs to parse json responses instead of using regex matches to WIP: Change go-sbotcli-rs to parse json responses instead of using regex matches 2021-11-16 15:16:23 +00:00
Author
Owner

Questions about which sbotcli calls to use to get the right filters will affect this code, but this is a basic PR going in the direction of parsing JSON, if we decide that's what we want to do @glyph.

So maybe no detailed review is needed. But some type of green light would help: whether you think JSON is a good direction, whether there's a way to use miniserde, whether to stick with regex, other ideas, etc.

Owner

@notplants

Yesterday I realised we are getting ahead of ourselves here and making a blunder.

go-sbotcli-rs is intended to be a thin wrapper around the sbotcli tool. It should provide a way to invoke sbotcli commands and return the stdout or a clear error description with stderr. It should not be concerned with parsing or filtering output; no Regex, serde or miniserde.

The wrapper approach allows library users to have their own opinions about parsing, filtering and grouping commands - without having to live with our design choices.

So we should keep this repo focused and then start a separate repo which will use go-sbotcli-rs as a dependency and build on top of it. Then we can change our parsing and filtering approach over time without making any breaking changes to go-sbotcli-rs.
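As a rough illustration of how thin that wrapper could stay, here's a sketch using only the standard library; the binary name, subcommand, and error handling are placeholders rather than a settled API.

```rust
use std::io;
use std::process::Command;

/// Run an `sbotcli` subcommand and return its stdout untouched, or its stderr
/// wrapped in an error. No parsing, no regex, no serde.
fn sbotcli(args: &[&str]) -> io::Result<String> {
    let output = Command::new("sbotcli").args(args).output()?;
    if output.status.success() {
        Ok(String::from_utf8_lossy(&output.stdout).into_owned())
    } else {
        Err(io::Error::new(
            io::ErrorKind::Other,
            String::from_utf8_lossy(&output.stderr).into_owned(),
        ))
    }
}

fn main() {
    // Illustrative call; this only works with an `sbotcli` binary on the PATH.
    match sbotcli(&["whoami"]) {
        Ok(stdout) => print!("{}", stdout),
        Err(e) => eprintln!("sbotcli failed: {}", e),
    }
}
```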


> There also could be future advantages to orienting around JSON: as we start to build more complex interactions with the retrieved objects, it would help to be able to work with them as structs and create, modify, or filter them.

Agreed that the JSON object model provides some great conveniences for working with grouped data.

One small word of caution I'll offer here (not directly related to JSON) is to be wary of prematurely over-engineering. We should concentrate our design choices around satisfying the feature set we've defined for the first PeachPub release. If we need to introduce complexity for additional features, we can always do that later.

Author
Owner

This feels to me like deciding whether we want one or two libraries, since the parsing still needs to live somewhere. So it's a tradeoff of whether calling sbot commands and returning stdout is enough functionality to be its own library.

I've used some API libraries that do parsing and return structs, not just strings for the user to parse. But maybe there are some that separate these as well?

I think this relates to the conversation we were having about errors as well. I would ask "what will this be used for?" as a way of reasoning about what it should be like.

If a library user wants to write a Rust application using ssb, they might be happy to have more useful functions available in the library, and could always drop down to a more low-level command if there was something the library didn't provide.

As I understand it, the way you describe it:

  • the low-level library (go-sbotcli-rs): a one-to-one correspondence between Rust functions and go-sbot commands, returning stdout.
  • the library on top of that (sbot): functions like set_relationship, get_relationship, get_profile_info, get_about_name, get_about_description, get_profile_image. These are the functions we will call from peach-web (a rough sketch of what such signatures might look like is below).
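A purely hypothetical sketch of what those higher-level signatures could look like; the struct fields, return types, and error type are invented for illustration, not an agreed design.

```rust
// Invented types for illustration only.
pub struct Profile {
    pub name: Option<String>,
    pub description: Option<String>,
    pub image_blob_ref: Option<String>,
}

// Placeholder error type; a real library would define its own.
type SbotError = String;

// Hypothetical surface of the higher-level library.
pub trait SbotCalls {
    fn set_relationship(&self, peer_id: &str, following: bool, blocking: bool) -> Result<(), SbotError>;
    fn get_relationship(&self, peer_id: &str) -> Result<(bool, bool), SbotError>;
    fn get_profile_info(&self, peer_id: &str) -> Result<Profile, SbotError>;
    fn get_about_name(&self, peer_id: &str) -> Result<Option<String>, SbotError>;
    fn get_about_description(&self, peer_id: &str) -> Result<Option<String>, SbotError>;
    fn get_profile_image(&self, peer_id: &str) -> Result<Option<String>, SbotError>;
}
```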

I could still see a case for just making one library and, if there are low-level calls that get used in multiple places, abstracting them as needed. Also in the spirit of not prematurely engineering for theoretical library users whose needs we don't know. But I don't know. Will be curious to talk to Alex today about how they would go about filtering in Go.

Author
Owner

It could also be one library which includes both the low-level calls (a one-to-one correspondence with sbotcli calls) and additional higher-level calls, and the library user (such as us, in peach-web) can use either.

I see another benefit now of dividing into two libraries that maybe you were already thinking about:

  • the low-level library doesn't need serde as a dependency, so it's lighter for someone to include if they end up making other choices about parsing

So by dividing the libraries we allow users to have smaller binaries if needed, even if the quick-start user probably just wants functions that give them the info they need.

Author
Owner

what would you call the two libraries?

Owner

> This feels to me like deciding whether we want one or two libraries. And would mostly be a small distinction, as the parsing still needs to live somewhere.

It's a small distinction with potentially big consequences. Of all the code we're writing for PeachCloud, go-sbotcli-rs is the most likely to be used by other developers in other projects. By separating the libraries, we can offer high-level, convenient functions which perform parsing and return structs - while still offering low-level flexibility for other tool builders.

I can't see any obvious downsides to the two library approach. Since we need to perform sbotcli calls and capture the stdout and stderr anyways, we're not building anything we don't need. We are engineering for our own needs while making it easier for other devs to benefit from our work if they want to build their own tools.

> As I understand it, the way you describe it:
>
>   • the low-level library (go-sbotcli-rs): a one-to-one correspondence between Rust functions and go-sbot commands, returning stdout.
>   • the library on top of that (sbot): functions like set_relationship, get_relationship, get_profile_info, get_about_name, get_about_description, get_profile_image. The functions we will call from peach-web.

Exactly. Could be go-sbotcli-rs and peach-ssb or peach-sbot.


> Will be curious to talk to Alex today about how they would go about filtering in Go.

Yeah, I'm curious about this too! Hope we can get some input from cryptix. I don't think we should design filtering solutions until we've tapped into the wisdom of more experienced Butts.

Owner

Only just saw your other comments now:

> I see another benefit now of dividing into two libraries that maybe you were already thinking about:
>
> the low-level library doesn't need serde as a dependency, so it's lighter for someone to include if they end up making other choices about parsing
>
> so by dividing the libraries we allow users to have smaller binaries if needed, even if the quick-start user probably just wants functions that give them the info they need

Exactly :) We're on the same page! :) And the low-level library user can then decide if they want to use regex or serde and miniserde or some other approach entirely (without incurring any unnecessary compilation or binary costs).

Author
Owner

I think we're mostly on the same page, but I still don't know that most library users would be more likely to use go-sbotcli-rs than peach-sbot.

Which also makes me wonder if it should be called peach-sbot, if it's still returning general data structures. Although I'm not sure what else it could be called.

Maybe go-sbot-rs-bindings (low-level) and go-sbot-rs (high-level). Interestingly, the Sbot object will be instantiated in the same way for both, I think.

I'm thinking about this API I used, https://github.com/frnsys/arena, which a friend made for are.na. I've had various projects and scripts where I've wanted to work with are.na, and I used his API. I was just happy it did the stuff I needed to do.

Author
Owner

One more potential downside of two libraries: we'll potentially have to write some functions twice.

If peach-web instantiates an instance of Sbot from the high-level library, then maybe it should not also make calls to the low-level library, as that would be confusing. So there might be some high-level library calls which basically just call the low-level library call and return the result. But maybe that's just the extra typing required in order to provide this extra flexible layer of abstraction.

Author
Owner

OK, here is a slight variant approach to consider:

https://legionlabs.com/blog/rust_ffi_finding/

found via @soapdog's post:
https://viewer.scuttlebot.io/%25lyYhUo6eYfX7nRqwrix%2BQGIDxJe3gBADtrFfEBrdj%2B8%3D.sha256

I am not 100% sure, but I believe that if we just want functions that mirror the sbotcli commands exactly (the low-level library), then, using the approach described in the blog post and by soapdog, we could auto-generate this low-level library so that it exactly matches the CLI (and we don't need to manually update it).
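Purely as a sketch of the direction (not something that works today): if go-sbot were compiled via cgo into a C-compatible static library, say `libgosbot.a`, exporting a function like `gosbot_whoami`, the Rust side of that low-level layer might look roughly like this. Both the library and the exported function name are hypothetical, so this will not link as-is.

```rust
use std::ffi::CStr;
use std::os::raw::c_char;

// Hypothetical: assumes a cgo-built static library `libgosbot.a` that exports
// `gosbot_whoami`. Neither exists today; this only illustrates the shape of the
// Rust-side bindings described in the linked blog posts.
#[link(name = "gosbot", kind = "static")]
extern "C" {
    fn gosbot_whoami() -> *const c_char;
}

fn main() {
    // Calling foreign code is unsafe; we trust the Go side to return a valid,
    // NUL-terminated string (ownership and freeing are glossed over here).
    let id = unsafe { CStr::from_ptr(gosbot_whoami()) };
    println!("local feed id: {}", id.to_string_lossy());
}
```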

Author
Owner
another blog post in a similar vein: https://radu-matei.com/blog/from-go-to-rust-static-linking-ffi/
Author
Owner

another advantage of pushing forward on this approach is that we would make it easier for people to also make these FFI bindings in other languages besides Rust

Author
Owner
more documentation: https://doc.rust-lang.org/book/ch19-01-unsafe-rust.html#using-extern-functions-to-call-external-code
Author
Owner
https://github.com/alexcrichton/rust-ffi-examples
Author
Owner

posted on ssb too just to see if anyone has an opinion

Owner

> I think we're mostly on the same page, but I still don't know that most library users would be more likely to use go-sbotcli-rs than peach-sbot.

I agree; most users are more likely to use the higher-level library and we should make it as pleasant to use as possible. What I'm trying to convey is that there will be n > 0 users who only want to use the low-level library. We are in a position to cater to both groups with minimal extra work.

> Which also makes me wonder if it should be called peach-sbot, if it's still returning general data structures. Although I'm not sure what else it could be called.

But it's not just returning general data structures, right? It seemed like your intention was to make it super convenient by grouping the results of various sbot calls into more complex structures. It could be called peach-sbot-query or peach-ssb-query to distinguish it from go-sbotcli-rs.

> we'll potentially have to write some functions twice.
>
> If peach-web instantiates an instance of Sbot from the high-level library, then maybe it should not also make calls to the low-level library, as that would be confusing. So there might be some high-level library calls which basically just call the low-level library call and return the result. But maybe that's just the extra typing required in order to provide this extra flexible layer of abstraction.

I don't intend peach-web to call both the high-level and low-level library; it will only ever call the high-level library.

We will have to write two functions but they will not be the same. One function (low-level) calls sbotcli and returns the output; one function (high-level) calls go-sbotcli-rs and then parses the output. It's the exact same work being done, just split into two.


**FFI Discussion:**

I like the FFI approach and would choose that over the command call approach we are currently using. One of the main benefits - along with being useful for devs in multiple languages - is being able to bundle the static library into a Rust binary, meaning that we don't have to distribute the sbotcli binary with PeachCloud.

The problem is one of time and skill. Who can / will modify the Go code to be compatible with static compilation? How long will that take? Would we rather delay the release of PeachPub (might be many months) or work on the command approach in the meantime?

The output of the commands will be the same whether called from the sbotcli binary or FFI bindings, so at least our high-level library would be compatible with both.

Author
Owner

> It's the exact same work being done, just split into two.

I think there may be cases where the low-level calls already return the strings we want, or where we don't need to parse the result, in which case the high-level call will just call the low-level call without additional processing.

E.g. something like blob_add, where we don't need to parse the result into a struct or do any regex matching. So there would be some duplication here (a blob_add function in the high-level library which just calls the low-level one).

So I think it's non-zero extra work, but I still think this could be an acceptable tradeoff.

Author
Owner

> But it's not just returning general data structures, right? It seemed like your intention was to make it super convenient by grouping the results of various sbot calls into more complex structures. It could be called peach-sbot-query or peach-ssb-query to distinguish it from go-sbotcli-rs.

this is a very semantic discussion, as a lot of programming is lol.

Kind of hinges around the word 'general'. How general, from 0 (only peach) to UTF8 (9.5).

Perhaps the high-level library would have functions as convenient as needed and define its own structs (like the are.na API I mentioned), but also be general enough to be usable by projects other than PeachCloud, thus not having peach in the name.

Maybe, go-sbotcli-rs (low-level bindings) and go-sbot-api (a Rusty API for working with go-sbot) ?

Author
Owner

> The problem is one of time and skill. Who can / will modify the Go code to be compatible with static compilation? How long will that take? Would we rather delay the release of PeachPub (might be many months) or work on the command approach in the meantime?

I can ask @cblgh as well how much time they think it would require. Agreed that if it's a long time, we could try making the command-call approach first and substituting it later.

Owner

> I think there may be cases where the low-level calls already return the strings we want, or where we don't need to parse the result, in which case the high-level call will just call the low-level call without additional processing.

This is a case where re-exports come in handy. We can re-export low-level functions in our higher-level library, such that the consumer of the higher-level API has access and no code duplication is required. Functions (and types, for that matter) can be exported individually as needed or we could simply re-export the entire low-level library and layer our parsing and filtering logic on top. Then someone can use our higher-level library as their only dependency and still have the best of both worlds.

Re-exporting individual types or functions is generally discouraged when the dependency from which you're exporting is not under your control. In that case, it is better to re-export the entire crate. However, in our case we would control both libraries.
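A self-contained toy of the re-export idea, using modules to stand in for the two crates (in the real case the `pub use` would point at the low-level dependency crate, and `blob_add` here is just a stand-in function):

```rust
mod low_level {
    // stands in for go-sbotcli-rs: returns raw output, no parsing
    pub fn blob_add(path: &str) -> String {
        format!("&fake-blob-ref-for-{}.sha256", path)
    }
}

mod high_level {
    // Re-export the low-level call unchanged, so consumers of the high-level
    // crate can reach it without adding a second dependency...
    pub use crate::low_level::blob_add;

    // ...while the high-level crate layers convenient functions on top.
    pub fn blob_ref(path: &str) -> String {
        blob_add(path).trim().to_string()
    }
}

fn main() {
    // One import surface gives both the passed-through and the added function.
    println!("{}", high_level::blob_add("cat.jpg"));
    println!("{}", high_level::blob_ref("cat.jpg"));
}
```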

A real-world example of re-exports can be seen in serde. serde and serde_derive are separate libraries. serde re-exports serde_derive when the feature is selected. That means we can import serde as a single dependency, like this:

```toml
serde = { version = "1", features = ["derive"] }
```

And we can use the Deserialize and Serialize macros directly from serde:

```rust
use serde::{Deserialize, Serialize};
```

> Maybe, go-sbotcli-rs (low-level bindings) and go-sbot-api (a Rusty API for working with go-sbot) ?

Sure.

Author
Owner

Ah, this re-export thing sounds perfect for this case, thanks for explaining it.

Also had a good call with Alex which helped give more context on sbotcli, and also raised some questions. We were hoping to try to schedule a call with cryptix to, among other things, figure out what types of interfaces he sees go-sbot supporting long-term, vs. stuff that should be built outside of it. So there is still more to figure out about where filtering or querying should happen. Alex also thought it would be good to talk with cryptix about FFI, so maybe that call can shed more light on that too.

In the call with Alex we were able to walk through some concrete cases and try to figure out what would be the most go-sbot-idiomatic way to look up certain values, but Alex felt he needed some more time to research as well.

So to give an update on that call:

  • I learned some stuff about go-sbot from alex
  • alex will do some more thinking and research
  • a call with cryptix (and all of us) is probably needed soon
Author
Owner

> Re-exporting individual types or functions is generally discouraged when the dependency from which you're exporting is not under your control. In that case, it is better to re-export the entire crate. However, in our case we would control both libraries.

Just writing this down here as a note to look into more later. Based on what you said, I wonder if go-sbot-api could be an additional impl block on the Sbot struct defined in go-sbot-rs. So if you are using go-sbot-api you can call the go-sbot-rs functions on the Sbot, and you can also call the additional go-sbot-api functions on the Sbot.

Like in this code example: https://play.rust-lang.org/?gist=d3c7bd9cb01767d517ed&version=stable

Although I haven't looked into it deeply enough to see how this works when the impls are in different crates.

Owner

We can't implement our own methods for an external type.

> You can only define an inherent implementation for a type in the same crate where the type was defined.

- source: [Rust compiler error E0116](https://doc.rust-lang.org/error-index.html#E0116)

Instead, we can define a trait in our higher-level library and implement it for our lower-level struct:

**`lib.rs` in low-level library** (pretend it's called `go_sbot_api`)

```rust
pub struct SbotApi {
    pub path: String,
}

impl SbotApi {
    pub fn init(path: String) -> Self {
        SbotApi { path }
    }
}
```

**`lib.rs` in high-level library**

```rust
use go_sbot_api::SbotApi;

trait SbotQueries {
    fn report_path(&self) -> String;
}

impl SbotQueries for SbotApi {
    fn report_path(&self) -> String {
        // get the `path` value from our instance of `SbotApi`
        let path = &self.path;
        format!("sbotcli binary is located at {}", path)
    }
}

#[cfg(test)]
mod tests {
    use crate::SbotQueries;
    use go_sbot_api::SbotApi;

    #[test]
    fn report_path_test() {
        let path = "/home/glyph/sbotcli".to_string();
        let sbot = SbotApi::init(path);
        assert_eq!(
            sbot.report_path(),
            "sbotcli binary is located at /home/glyph/sbotcli".to_string()
        );
    }
}
```

This approach gives us all of the methods from the base SbotApi struct (provided that they've been made public) and allows us to add additional methods in our high-level library.

This pull request is marked as a work in progress.
This branch is out-of-date with the base branch