The same way they are now. We already have machines which have simple emergent behavior which was not planned by the programmer.
Feppe, you may be watching or reading too much science fiction.
Part of the problem is language. We think with language, but language sometimes can't quite deal with reality. For example: English uses the word "hot" for something that can burn your hand and the same word for a spice that feels as if it's burning your mouth. Spanish makes a distinction between "caliente" for something that can burn your hand and "picante" for something that burns your mouth. Thai makes the same distinction with the words (rendered phonetically in English) "lawn" and "pet," respectively. When I say "hot" in English I can't be understood precisely unless I add modifiers to go with the word. The same thing's true of the English word "love," as C. S. Lewis pointed out in detail. Incidentally, since your profile tells me you live in the Netherlands, I assume English isn't your first language. I'm very impressed with your grasp of it. I wish I could do that with a few languages.
For some time we've had machines that are said to "learn." That seems to be the word we use for that situation, though "memorize" would be closer to the truth. What the machine actually does is store data it's programmed to accept and store. This is not "learning," nor is it "thinking." The programmer did the thinking.
We also can have a machine that correlates the data it's "learned" (stored) and comes to a conclusion about the correlation. That might be called "thinking" except for one thing: it's the programmer who did the thinking. The machine arrives at its conclusion as a result of an "if, then, else" sequence designed by the programmer. In what's come to be called "artificial intelligence" (an oxymoron) the "if, then, else" sequence may be a lot more complicated than I've made it sound, but that's what happens.
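To make the point concrete, here's a deliberately trivial sketch in Python of the kind of "if, then, else" sequence I mean. The thresholds and categories are invented for illustration; a real program would be far more elaborate, but the principle is the same:

```python
# The machine "concludes" whether an image is well exposed -- but every
# threshold and category below was chosen by the programmer, not the machine.
def classify_exposure(mean_brightness):
    """Classify an image by its mean pixel brightness (0-255 scale)."""
    if mean_brightness < 60:
        return "underexposed"
    elif mean_brightness > 200:
        return "overexposed"
    else:
        return "acceptable"

print(classify_exposure(45))   # underexposed
print(classify_exposure(128))  # acceptable
```

The machine appears to "decide," but every possible conclusion was written down ahead of time by a human.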
Finally, we can have a machine that engages in what you called "emergent behavior which was not planned by the programmer." That's true, but only in the sense that the programmer set up a series of alternatives one or more of which the machine selects. That process can get pretty complicated too, and appear to be something it's not. It's correct to say that the machine selected an alternative that was "not planned by the programmer," but the machine's world was designed by the programmer and the machine can't step outside its world.
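The same idea in code form (again a toy sketch, with alternatives I've made up for the example): the program's choice may surprise the programmer, but the menu it chooses from was fixed in advance.

```python
import random

# The machine's entire "world": three alternatives set up by the programmer.
ALTERNATIVES = ["crop square", "crop to 4x5", "leave uncropped"]

def pick_treatment(seed):
    """Select one alternative. The selection may be unpredictable,
    but it can only ever be one of the options programmed in."""
    random.seed(seed)
    return random.choice(ALTERNATIVES)

choice = pick_treatment(42)
assert choice in ALTERNATIVES  # it cannot step outside its world
```

However unpredictable the output looks, it's always drawn from the list the programmer supplied.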
HAL was an interesting character in "2001," but he was just that -- a character.
I'm afraid we've gotten awfully far away from the original question about "editioning" photographic prints.