In general, parsing is when you take a large chunk of data and break it down into smaller, more useful chunks.
When a compiler or interpreter is turning the source code of a programming language into executable code, it must first parse that source code so it knows what statements the program is trying to use. It can then use that information to translate your source into computer-understandable machine code.
In computer science and linguistics, parsing, or, more formally, syntactic analysis, is the process of analyzing a text, made of a sequence of tokens (for example, words), to determine its grammatical structure with respect to a given (more or less) formal grammar. Parsing can also be used as a linguistic term, for instance when discussing how phrases are divided up in garden path sentences.
Parsing is also an earlier term for the diagramming of sentences of natural languages, and is still used for the diagramming of inflected languages, such as the Romance languages or Latin. The term parsing comes from Latin pars (ōrātiōnis), meaning part (of speech).
justin smythhe wrote: Can someone give a simple explanation to begin with?
Please tell me which of these is a good (or the best) definition:
Campbell Ritchie wrote:Parsing generally means taking text apart and deciding what the relationships between successive tokens are. It is a grammatical term, and can be used of any language, natural, arithmetic, logic, computer languages, etc. The parseInt() method does something very similar, converting the String to an int.
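To make that concrete, here is a minimal sketch of `parseInt()` acting as a tiny parser: it analyzes the characters of a String against the grammar of an integer literal and either produces an `int` or rejects the input. (The example inputs are made up for illustration.)

```java
public class ParseIntDemo {
    public static void main(String[] args) {
        // parseInt() "parses" the text "42" into the int value 42
        int n = Integer.parseInt("42");
        System.out.println(n + 1); // prints 43

        // Text that does not fit the grammar of an int is rejected
        try {
            Integer.parseInt("forty-two");
        } catch (NumberFormatException e) {
            System.out.println("not a valid int: forty-two");
        }
    }
}
```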
justin smythhe wrote: What are these "tokens"? Can you give me a (slightly bigger) example to illustrate your explanation? If possible, please show me something that needs to be parsed,
why it needs to be parsed, and what the result is after it has been parsed.
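As one possible answer to that question, here is a hedged sketch of what "tokens" look like. The `tokenize` helper below is hypothetical (not from the thread): it takes the text of an arithmetic expression, which needs to be parsed because to a computer it is just a sequence of characters, and breaks it into its smallest meaningful units, numbers and operators.

```java
import java.util.ArrayList;
import java.util.List;

public class Tokenizer {
    // Hypothetical helper: splits an arithmetic expression into tokens.
    // A token is the smallest meaningful unit: here, a number or an operator.
    static List<String> tokenize(String input) {
        List<String> tokens = new ArrayList<>();
        int i = 0;
        while (i < input.length()) {
            char c = input.charAt(i);
            if (Character.isWhitespace(c)) {
                i++;                                   // skip spaces between tokens
            } else if (Character.isDigit(c)) {
                int start = i;                         // consume a whole run of digits
                while (i < input.length() && Character.isDigit(input.charAt(i))) i++;
                tokens.add(input.substring(start, i)); // a number token
            } else {
                tokens.add(String.valueOf(c));         // a single-character operator token
                i++;
            }
        }
        return tokens;
    }

    public static void main(String[] args) {
        System.out.println(tokenize("3 + 42 * 2"));
        // prints [3, +, 42, *, 2]
    }
}
```

The result of this first stage is a list of tokens; a parser would then work out the relationships between successive tokens (for example, that `*` binds tighter than `+`).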
Because those parsers are using different grammars.
Stephan van Hulst wrote: . . . an arithmetical expression by one parser, and can be seen as a Los Angeles phone number by another.
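A small sketch of that idea (the input string and the phone-number pattern are made-up examples): the very same text gives different results depending on which grammar the parser applies.

```java
public class TwoGrammars {
    public static void main(String[] args) {
        String text = "213-555-0172";   // hypothetical example input

        // Grammar 1: arithmetic — '-' means subtraction
        String[] parts = text.split("-");
        int result = Integer.parseInt(parts[0])
                   - Integer.parseInt(parts[1])
                   - Integer.parseInt(parts[2]);
        System.out.println("as arithmetic: " + result);   // prints as arithmetic: -514

        // Grammar 2: US phone number — '-' is just a separator
        boolean isPhone = text.matches("\\d{3}-\\d{3}-\\d{4}");
        System.out.println("as phone number: " + isPhone); // prints as phone number: true
    }
}
```

Neither reading is "wrong": the difference lies entirely in the grammar each parser was built for.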