• 0 Posts
  • 127 Comments
Joined 1 year ago
cake
Cake day: July 1st, 2023

help-circle













  • Depends on where you live.

    The bank will often send an inspector for a loan, but it’s literally him just walking around and validating there is a house and it’s not in shambles. He’ll look at things like the roof from the outside and when it was redone, but isn’t going to hop into your crawlspace to look for signs of water damage.

    Then you have the “private” inspection company that you can pay to check your home for yourself. These companies are know to cost a lot of money, often detailing things they can’t be sure are “risks”. They’ll go in the crawlspace and note all sorts of things.

    On my house the expensive private inspection said “the roof here is kinda saggin and there’s a bump there, it could be anything”. In the same report he accidentally shows a picture from under the roof where you can see there was a repair and some extra framing, causing the small “bump” that is purely aesthetic. Didn’t mention that part.

    Getting someone to look at it post purchase is likely going to be much cheaper, and I’m definitely not recommending people don’t get inspections when buying houses if they don’t know what they’re doing.




  • RedditWanderer@lemmy.worldtoFunny@sh.itjust.worksThat's not troubling at all
    link
    fedilink
    arrow-up
    16
    arrow-down
    5
    ·
    edit-2
    6 months ago

    It’s not weird because of that. The bot could have easily explained it can’t answer legally, it didn’t need to say: sorry gotta end this k bye

    This is probably a trigger on preventing it from mixing in laws of AI or something, but people would expect it can discuss these things instead of shutting down so it doesn’t get played. Saying the AI acted as a lawyer is a pretty weak argument to blame copilot.

    Edit: no idea who is downvoting this but this isn’t controversial. This is specifically why you can inject prompts into data fed into any GPT and why they are very careful with how they structure information in the model to make rules. Right now copilot will give technically legal advice with a disclaimer, there’s no reason it wouldn’t do that only on that question if it was about legal advice or laws.