When I search this topic online, I always find either wrong information or advertising lies. So what is something LLMs can actually do very well, as in genuinely useful, not just outputting a nonsensical word salad that sounds coherent?
While this is something LLMs are decent at, I feel it's only of value if your notes are unstructured, and it presents infosec concerns.
I guess my notes are unstructured, as in they’re what I type as I’m in the meeting. I’m a “more is better” sort of note taker, so it’s definitely faster to let AI pull things out.
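For what it's worth, the "pull things out" step can be just a few lines. Here's a rough sketch of what I mean, assuming the OpenAI Python SDK and an example model name; the notes, the prompt wording, and the model are all placeholders, and any chat-style API would work the same way:

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Example of the kind of raw, typed-during-the-meeting notes I mean
raw_notes = """
- intro chatter, Q3 numbers look fine
- Sam to send the vendor contract to legal by Friday
- decision: not renewing the analytics tool
- Priya asked about headcount, parked for next week
"""

# Ask the model to extract only the actionable bits from the dump
response = client.chat.completions.create(
    model="gpt-4o-mini",  # example model name, swap for whatever you use
    messages=[
        {
            "role": "system",
            "content": "Extract action items and decisions from raw meeting "
                       "notes. Return a short bullet list, nothing else.",
        },
        {"role": "user", "content": raw_notes},
    ],
)

print(response.choices[0].message.content)
```

Nothing fancy, but since I dump everything into my notes, having the model filter them down afterwards is faster than structuring them as I type.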
Infosec … I guess people will have to evaluate that for themselves. Certainly, for my use case there’s no concern.