• ITGuyLevi@programming.dev
    link
    fedilink
    English
    arrow-up
    4
    ·
    18 hours ago

    Is it invisible to accessibility options as well? Like if I need a computer to tell me what the assignment is, will it tell me to do the thing that will make you think I cheated?

    • Sauerkraut@discuss.tchncs.de
      link
      fedilink
      English
      arrow-up
      4
      ·
      18 hours ago

      Disability accomodation requests are sent to the professor at the beginning of each semester so he would know which students use accessibility tools

        • A_Chilean_Cyborg@feddit.cl
          link
          fedilink
          English
          arrow-up
          2
          ·
          1 hour ago

          Probably postpone? Or start late paperwork to get acreditated?, talk with the teacher and explain what happened?

        • TachyonTele@lemm.ee
          link
          fedilink
          English
          arrow-up
          0
          ·
          8 hours ago

          You’re giving kids these days far too much credit. They don’t even understand what folders are.

          • Sas [she/her]@beehaw.org
            link
            fedilink
            English
            arrow-up
            2
            ·
            3 hours ago

            What a load of condescending shit. You’re giving kids not enough credit. Just because folders haven’t been relevant to them some kids don’t know about them, big deal. If they became in some way relevant they could learn about them. If you asked a millennial that never really used a computer they’d probably also not know. I’m fairly sure that people with disabilities know how to use accessibility tools like screen readers.

  • Queen HawlSera@lemm.ee
    link
    fedilink
    English
    arrow-up
    3
    ·
    22 hours ago

    I wish more teachers and academics would do this, because I"m seeing too many cases of “That one student I pegged as not so bright because my class is in the morning and they’re a night person, has just turned in competent work. They’ve gotta be using ChatGPT, time to report them for plagurism. So glad that we expell more cheaters than ever!” and similar stories.

    Even heard of a guy who proved he wasn’t cheating, but was still reported anyway simply because the teacher didn’t want to look “foolish” for making the accusation in the first place.

  • technocrit@lemmy.dbzer0.com
    link
    fedilink
    English
    arrow-up
    0
    arrow-down
    1
    ·
    edit-2
    17 hours ago

    This is invisible on paper but readable if uploaded to chatGPT.

    This sounds fake. It seems like only the most careless students wouldn’t notice this “hidden” prompt or the quote from the dog.

    Maybe if homework can be done by statistics, then it’s not worth doing.

    Maybe if a “teacher” has to trick their students in order to enforce pointless manual labor, then it’s not worth doing.

    Schools are not about education but about privilege, filtering, indoctrination, control, etc.

    • Goodman@discuss.tchncs.de
      link
      fedilink
      English
      arrow-up
      1
      ·
      13 hours ago

      It does feel like some teachers are a bit unimaginative in their method of assessment. If you have to write multiple opinion pieces, essays or portfolios every single week it becomes difficult not to reach for a chatbot. I don’t agree with your last point on indoctrination, but that is something that I would like to see changed.

    • ArchRecord@lemm.ee
      link
      fedilink
      English
      arrow-up
      1
      ·
      15 hours ago

      Schools are not about education but about privilege, filtering, indoctrination, control, etc.

      Many people attending school, primarily higher education like college, are privileged because education costs money, and those with more money are often more privileged. That does not mean school itself is about privilege, it means people with privilege can afford to attend it more easily. Of course, grants, scholarships, and savings still exist, and help many people afford education.

      “Filtering” doesn’t exactly provide enough context to make sense in this argument.

      Indoctrination, if we go by the definition that defines it as teaching someone to accept a doctrine uncritically, is the opposite of what most educational institutions teach. If you understood how much effort goes into teaching critical thought as a skill to be used within and outside of education, you’d likely see how this doesn’t make much sense. Furthermore, the heavily diverse range of beliefs, people, and viewpoints on campuses often provides a more well-rounded, diverse understanding of the world, and of the people’s views within it, than a non-educational background can.

      “Control” is just another fearmongering word. What control, exactly? How is it being applied?

      Maybe if a “teacher” has to trick their students in order to enforce pointless manual labor, then it’s not worth doing.

      They’re not tricking students, they’re tricking LLMs that students are using to get out of doing the work required of them to get a degree. The entire point of a degree is to signify that you understand the skills and topics required for a particular field. If you don’t want to actually get the knowledge signified by the degree, then you can put “I use ChatGPT and it does just as good” on your resume, and see if employers value that the same.

      Maybe if homework can be done by statistics, then it’s not worth doing.

      All math homework can be done by a calculator. All the writing courses I did throughout elementary and middle school would have likely graded me higher if I’d used a modern LLM. All the history assignment’s questions could have been answered with access to Wikipedia.

      But if I’d done that, I wouldn’t know math, I would know no history, and I wouldn’t be able to properly write any long-form content.

      Even when technology exists that can replace functions the human brain can do, we don’t just sacrifice all attempts to use the knowledge ourselves because this machine can do it better, because without that, we would be limiting our future potential.

      This sounds fake. It seems like only the most careless students wouldn’t notice this “hidden” prompt or the quote from the dog.

      The prompt is likely colored the same as the page to make it visually invisible to the human eye upon first inspection.

      And I’m sorry to say, but often times, the students who are the most careless, unwilling to even check work, and simply incapable of doing work themselves, are usually the same ones who use ChatGPT, and don’t even proofread the output.

    • TheRealKuni@lemmy.world
      link
      fedilink
      English
      arrow-up
      1
      ·
      17 hours ago

      Maybe if homework can be done by statistics, then it’s not worth doing.

      Lots of homework can be done by computers in many ways. That’s not the point. Teachers don’t have students write papers to edify the teacher or to bring new insights into the world, they do it to teach students how to research, combine concepts, organize their thoughts, weed out misinformation, and generate new ideas from other concepts.

      These are lessons worth learning regardless of whether ChatGPT can write a paper.

    • thebestaquaman@lemmy.world
      link
      fedilink
      English
      arrow-up
      1
      ·
      17 hours ago

      The whole “maybe if the homework can be done by a machine then its not worth doing” thing is such a gross misunderstanding. Students need to learn how the simple things work in order to be able to learn the more complex things later on. If you want people that are capable of solving problems the machine can’t do, you first have to teach them the things the machine can in fact do.

      In practice, compute analytical derivatives or do mildly complicated addition by hand. We have automatic differentiation and computers for those things. But I having learned how to do those things has been absolutely critical for me to build the foundation I needed in order to be able to solve complex problems that an AI is far from being able to solve.

  • Navarian@lemm.ee
    link
    fedilink
    English
    arrow-up
    1
    ·
    1 day ago

    For those that didn’t see the rest of this tweet, Frankie Hawkes is in fact a dog. A pretty cute dog, for what it’s worth.

  • Etterra@lemmy.world
    link
    fedilink
    English
    arrow-up
    1
    ·
    1 day ago

    Ah yes, pollute the prompt. Nice. Reminds me of how artists are starting to embed data and metadata in their pieces that fuck up AI training data.

      • TachyonTele@lemm.ee
        link
        fedilink
        English
        arrow-up
        1
        ·
        8 hours ago

        That’s interesting. Are there examples of this? I’m assuming they’re little one off dead end streets or similar.

  • Schtefanz@feddit.org
    link
    fedilink
    English
    arrow-up
    0
    arrow-down
    1
    ·
    20 hours ago

    Shouldn’t be the question why students used chatgpt in the first place?

    chatgpt is just a tool it isn’t cheating.

    So maybe the author should ask himself what can be done to improve his course that students are most likely to use other tools.

    • Zron@lemmy.world
      link
      fedilink
      English
      arrow-up
      2
      ·
      20 hours ago

      ChatGPT is a tool that is used for cheating.

      The point of writing papers for school is to evaluate a person’s ability to convey information in writing.

      If you’re using a tool to generate large parts of the paper, the teacher is no longer evaluating you, they’re evaluating chatGPT. That’s dishonest in the student’s part, and circumventing the whole point of the assignment.

      • technocrit@lemmy.dbzer0.com
        link
        fedilink
        English
        arrow-up
        0
        arrow-down
        1
        ·
        edit-2
        17 hours ago

        The point of writing papers for school is to evaluate a person’s ability to convey information in writing.

        Computers are a fundamental part of that process in modern times.

        If you’re using a tool to generate large parts of the paper

        Like spell check? Or grammar check?

        … the teacher is no longer evaluating you, in an artificial context

        circumventing the whole point of the assignment.

        Assuming the point is how well someone conveys information, then wouldn’t many people better be better at conveying info by using machines as much as reasonable? Why should they be punished for this? Or forced to pretend that they’re not using machines their whole lives?

        • ArchRecord@lemm.ee
          link
          fedilink
          English
          arrow-up
          1
          ·
          15 hours ago

          Computers are a fundamental part of that process in modern times.

          If you were taking a test to assess how much weight you could lift, and you got a robot to lift 2,000 lbs for you, saying you should pass for lifting 2000 lbs would be stupid. The argument wouldn’t make sense. Why? Because the same exact logic applies. The test is to assess you, not the machine.

          Just because computers exist, can do things, and are available to you, doesn’t mean that anything to assess your capabilities can now just assess the best available technology instead of you.

          Like spell check? Or grammar check?

          Spell/Grammar check doesn’t generate large parts of a paper, it refines what you already wrote, by simply rephrasing or fixing typos. If I write a paragraph of text and run it through spell & grammar check, the most you’d get is a paper without spelling errors, and maybe a couple different phrases used to link some words together.

          If I asked an LLM to write a paragraph of text about a particular topic, even if I gave it some references of what I knew, I’d likely get a paper written entirely differently from my original mental picture of it, that might include more or less information than I’d intended, with different turns of phrase than I’d use, and no cohesion with whatever I might generate later in a different session with the LLM.

          These are not even remotely comparable.

          Assuming the point is how well someone conveys information, then wouldn’t many people better be better at conveying info by using machines as much as reasonable? Why should they be punished for this? Or forced to pretend that they’re not using machines their whole lives?

          This is an interesting question, but I think it mistakes a replacement for a tool on a fundamental level.

          I use LLMs from time to time to better explain a concept to myself, or to get ideas for how to rephrase some text I’m writing. But if I used the LLM all the time, for all my work, then me being there is sort of pointless.

          Because, the thing is, most LLMs aren’t used in a way that conveys info you already know. They primarily operate by simply regurgitating existing information (rather, associations between words) within their model weights. You don’t easily draw out any new insights, perspectives, or content, from something that doesn’t have the capability to do so.

          On top of that, let’s use a simple analogy. Let’s say I’m in charge of calculating the math required for a rocket launch. I designate all the work to an automated calculator, which does all the work for me. I don’t know math, since I’ve used a calculator for all math all my life, but the calculator should know.

          I am incapable of ever checking, proofreading, or even conceptualizing the output.

          If asked about the calculations, I can provide no answer. If they don’t work out, I have no clue why. And if I ever want to compute something more complicated than the calculator can, I can’t, because I don’t even know what the calculator does. I have to then learn everything it knows, before I can exceed its capabilities.

          We’ve always used technology to augment human capabilities, but replacing them often just means we can’t progress as easily in the long-term.

          Short-term, sure, these papers could be written and replaced by an LLM. Long-term, nobody knows how to write papers. If nobody knows how to properly convey information, where does an LLM get its training data on modern information? How do you properly explain to it what you want? How do you proofread the output?

          If you entirely replace human work with that of a machine, you also lose the ability to truly understand, check, and build upon the very thing that replaced you.

    • fibojoly@sh.itjust.works
      link
      fedilink
      English
      arrow-up
      1
      ·
      17 hours ago

      Sounds like something ChatGPT would write : perfectly sensible English, yet the underlying logic makes no sense.

      • fishbone@lemmy.dbzer0.com
        link
        fedilink
        English
        arrow-up
        2
        ·
        8 hours ago

        The implication I gathered from the comment was that if students are resorting to using chatgpt to cheat, then maybe the teacher should try a different approach to how they teach.

        I’ve had plenty of awful teachers who try to railroad students as much as possible, and that made for an abysmal learning environment, so people would cheat to get through it easier. And instead of making fundamental changes to their teaching approach, teachers would just double down by trying to stop cheating rather than reflect on why it’s happening in the first place.

        Dunno if this is the case for the teacher mentioned in the original post, but the response is the vibe I got from the comment you replied to, and for what it’s worth, I fully agree. Spending time and effort on catching cheaters doesn’t help there be less cheaters, nor does it help people like the class more or learn better. Focusing on getting students enjoyment and engagement does reduce cheating though.

      • Randomgal@lemmy.ca
        link
        fedilink
        English
        arrow-up
        1
        ·
        16 hours ago

        Lemmy has seen a lot like that lately. Specially in these “charged” topics.

    • PM_Your_Nudes_Please@lemmy.world
      link
      fedilink
      English
      arrow-up
      0
      arrow-down
      1
      ·
      18 hours ago

      It’s the same argument as the one used against emulators. The actual emulator may not be illegal, but they are overwhelmingly used to violate the law by the end user.

    • abbadon420@lemm.ee
      link
      fedilink
      English
      arrow-up
      2
      ·
      20 hours ago

      But that’s fine than. That shows that you at least know enough about the topic to realise that those topics should not belong there. Otherwise you could proofread and see nothing wrong with the references

    • xantoxis@lemmy.world
      link
      fedilink
      English
      arrow-up
      1
      ·
      edit-2
      1 day ago

      Is it? If ChatGPT wrote your paper, why would citations of the work of Frankie Hawkes raise any red flags unless you happened to see this specific tweet? You’d just see ChatGPT filled in some research by someone you hadn’t heard of. Whatever, turn it in. Proofreading anything you turn in is obviously a good idea, but it’s not going to reveal that you fell into a trap here.

      If you went so far as to learn who Frankie Hawkes is supposed to be, you’d probably find out he’s irrelevant to this course of study and doesn’t have any citeable works on the subject. But then, if you were doing that work, you aren’t using ChatGPT in the first place. And that goes well beyond “proofreading”.

  • Lamps@lemm.ee
    link
    fedilink
    English
    arrow-up
    1
    ·
    2 days ago

    Just takes one student with a screen reader to get screwed over lol

      • Khanzarate@lemmy.world
        link
        fedilink
        English
        arrow-up
        0
        ·
        2 days ago

        Right, but the whitespace between instructions wasn’t whitespace at all but white text on white background instructions to poison the copy-paste.

        Also the people who are using chatGPT to write the whole paper are probably not double-checking the pasted prompt. Some will, sure, but this isnt supposed to find all of them its supposed to catch some with a basically-0% false positive rate.

        • Scrubbles@poptalk.scrubbles.tech
          link
          fedilink
          English
          arrow-up
          0
          ·
          edit-2
          1 day ago

          Yeah knocking out 99% of cheaters honestly is a pretty good strategy.

          And for students, if you’re reading through the prompt that carefully to see if it was poisoned, why not just put that same effort into actually doing the assignment?

          • Windex007@lemmy.world
            link
            fedilink
            English
            arrow-up
            0
            ·
            2 days ago

            Maybe I’m misunderstanding your point, so forgive me, but I expect carefully reading the prompt is still orders of magnitude less effort than actually writing a paper?

            • Scrubbles@poptalk.scrubbles.tech
              link
              fedilink
              English
              arrow-up
              0
              ·
              1 day ago

              Eh, putting more than minimal effort into cheating seems to defeat the point to me. Even if it takes 10x less time, you wasted 1x or that to get one passing grade, for one assignment that you’ll probably need for a test later anyway. Just spend the time and so the assignment.

              • where_am_i@sh.itjust.works
                link
                fedilink
                English
                arrow-up
                0
                ·
                1 day ago

                Disagree. I coded up a matrix inverter that provided a step-by-step solution, so I don’t have to invert them myself by hand. It was considerably more effort than the mind-boggling task of doing the assignment itself. Additionally, at least half of the satisfaction came from the simple fact of sticking it to the damn system.

                My brain ain’t doing any of your dumb assignments, but neither am I getting a less than an A. Ha.

                • Scrubbles@poptalk.scrubbles.tech
                  link
                  fedilink
                  English
                  arrow-up
                  1
                  ·
                  edit-2
                  1 day ago

                  Lol if this was a programming assignment, then I can 100% say that you are setting yourself up for failure, but hey you do you. I’m 15 years out of college right now, and I’m currently interviewing for software gigs. Programs like those homework assignments are your interviews, hate to tell you, but you’ll be expected to recall those algorithms, from memory, without assistance, live, and put it on paper/whiteboard within 60 minutes - and then defend that you got it right. (And no, ChatGPT isn’t allowed. Oh sure you can use it at work, I do it all the time, but not in your interviews)

                  But hey, you got it all figured out, so I’m sure not learning the material now won’t hurt you later and interviewers won’t catch on. I mean, I’ve said no to people who I caught cheating in my interviews, but I’m sure it won’t happen to you.

                  For reference, literally just this week one of my questions was to first build an adjacency matrix and then come up with a solution for finding all of the disjointed groups within that matrix and then returning those in a sorted list from largest to smallest. I had 60 minutes to do it and I was graded on how much I completed, if it compiled, edge cases, run time, and space required. (again, you do not get ChatGPT, most of the time you don’t get a full IDE - if you’re lucky you get Intellisense or syntax highlighting. Sometimes it may be you alone writing on a whiteboard)

                  Of course that’s just one interview, that’s just the tech screen. Most companies will then move you onto a loop (or what everyone lovingly calls ‘the Guantlet’) which is 4 1 hour interviews in a single day, all exactly like that.

                  And just so you know, I was a C student, I was terrible in academia - but literally no one checks after school. They don’t need to, you’ll be proving it in your interviews. But hey, what do I know, I’m just some guy on the internet. Have fun with your As. (And btw, as for sticking it to the system, you are paying them for an education - of which you aren’t even getting. So, who’s screwing the system really?)

                  (If other devs are here, I just created a new post here: https://lemmy.world/post/21307394. I’d love to hear your horror stories too, as in sure our student here would love to read them)

  • ryven@lemmy.dbzer0.com
    link
    fedilink
    English
    arrow-up
    0
    ·
    1 day ago

    My college workflow was to copy the prompt and then “paste without formatting” in Word and leave that copy of the prompt at the top while I worked, I would absolutely have fallen for this. :P

      • BatmanAoD@lemmy.world
        link
        fedilink
        English
        arrow-up
        0
        ·
        1 day ago

        Wot? They didn’t say they cheated, they said they kept a copy of the prompt at the top of their document while working.

        • finitebanjo@lemmy.world
          link
          fedilink
          English
          arrow-up
          0
          ·
          edit-2
          1 day ago

          Any use of an LLM in understanding any subject or create any medium, be it papers or artwork, results in intellectual failure, as far as I’m concerned. Imagine if this were a doctor or engineer relying on hallucinated information, people could die.

          • juliebean@lemm.ee
            link
            fedilink
            English
            arrow-up
            1
            ·
            17 hours ago

            they didn’t say they used any kind of LLM though? they literally just kept a copy of the assignment (in plain text) to reference. did you use an LLM to try to understand their comment? lol

            • finitebanjo@lemmy.world
              link
              fedilink
              English
              arrow-up
              0
              arrow-down
              1
              ·
              13 hours ago

              Its possible by “prompt” they were referring to assignment instructions, but that’s pretty pointless to copy and paste in the first place and very poor choice of words if so especially in a discussion about ChatGPT.

          • AWildMimicAppears@lemmy.dbzer0.com
            link
            fedilink
            English
            arrow-up
            1
            ·
            edit-2
            1 day ago

            there is no LLM involved in ryven’s comment:

            • open assignment
            • select text
            • copy text
            • create text-i-will-turn-in.doc
            • paste text without formatting
            • work in this document, scrolling up to look at the assignment again
            • fall for the “trap” and search like an idiot for anything relevant to assignment + frankie hawkes, since no formatting

            i hope noone is dependent on your reading comprehension mate, or i’ll have some bad news

            • finitebanjo@lemmy.world
              link
              fedilink
              English
              arrow-up
              0
              arrow-down
              1
              ·
              edit-2
              13 hours ago

              lmao fuck off, why put so much effort into defending the bullshit machines?

              EDIT: I honestly didnt even read your comment, too much time wasted arguing with bota and techbros, but if you mean to try to explain the user meant copying the assignment instructions then said user should never have used the word “prompt” in this context to begin with.

              • BatmanAoD@lemmy.world
                link
                fedilink
                English
                arrow-up
                1
                ·
                4 hours ago

                Holy shit, “prompt” is not primarily an AI word. I get not reading an entire article or essay before commenting, but maybe you should read an entire couple of sentences before making a complete ass of yourself for multiple comments in a row. If you can’t manage that, just say nothing! It’s that easy!

              • stevegiblets@lemmy.world
                link
                fedilink
                English
                arrow-up
                0
                ·
                24 hours ago

                I feel nothing but pity for how stupid you are acting right now. Read it all again and see if you can work it out.

                • finitebanjo@lemmy.world
                  link
                  fedilink
                  English
                  arrow-up
                  0
                  arrow-down
                  1
                  ·
                  18 hours ago

                  How dare I hurt your feelings by standing up for academic honesty and responsibility. How dare I oppose automating paperwork meant to prove competence of students who will decide the fates of other people in their profession.

                  Just despicable, absolutely attrocious behavior.