Cake day: July 22nd, 2023

  • To give the (anthropomorphized) model credit, I think the biggest problem is that the way it’s trained gives it practically no concept of the relation between the tokens it spits out and the HTML that the ChatGPT UI wrapper will eventually render.

    Because of that, inserting regular newlines often doesn’t work: a single newline in Markdown doesn’t translate to a line break in the rendered HTML.
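That newline behavior is CommonMark's soft-break vs. hard-break rule: a bare single newline is a soft break (rendered as a space), while a line ending in two or more trailing spaces becomes a hard break (`<br />`). A toy sketch of just that rule (not a real Markdown renderer):

```python
def render_line_breaks(md: str) -> str:
    """Toy illustration of CommonMark's line-break rule:
    a line ending in two or more spaces becomes a hard break (<br />),
    while a bare single newline is a soft break rendered as a space."""
    lines = md.split("\n")
    out = []
    for i, line in enumerate(lines):
        if i < len(lines) - 1:          # every line except the last
            if line.endswith("  "):     # trailing double space -> hard break
                out.append(line.rstrip() + "<br />")
            else:                       # bare newline -> soft break (space)
                out.append(line + " ")
        else:
            out.append(line)
    return "".join(out)

print(render_line_breaks("line one\nline two"))    # collapses to one line
print(render_line_breaks("line one  \nline two"))  # keeps the break
```

So unless the model emits the trailing-space trick or a blank line, its single newlines silently disappear in the rendered output.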

    To force a particular structure I often ask it to use a well-known format like YAML. Because there is so much YAML training data, it’s practically impossible for the LLM not to put newlines in the correct places (and it typically renders correctly too, because the model puts it in a Markdown code block).
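A sketch of that trick. The `model_reply` string here is a hypothetical canned reply standing in for a real LLM call, and it uses JSON rather than YAML only because parsing YAML needs the third-party PyYAML package; the idea is identical — the format is so heavily represented in training data that the model reliably puts newlines and indentation where the parser expects them:

```python
import json

# Hypothetical canned reply, standing in for an actual LLM API call.
# With a prompt like "answer only with a JSON object", the model's
# output round-trips straight through a parser.
model_reply = """
{
  "title": "Shopping list",
  "items": ["eggs", "milk", "bread"]
}
"""

data = json.loads(model_reply)
print(data["items"])
```

If the reply doesn't parse, that's also a cheap automatic signal to retry the request.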