Many of the people in the audience have published at least one scientific publication. Can you raise your hand? OK. So you are all, with me, responsible for what I'm going to discuss today. Well, this talk, first of all, I would prefer to give in front of a beer or a glass of wine, in a relaxed atmosphere. It's not a scientific talk. The idea of discussing these topics started years ago when, this time really in front of a glass of beer or wine with colleagues of my age, we started to realize that the way of doing science has changed quite deeply from the time we started until now. At a certain point, I got tired of just discussing these things, and I decided to look into the problem. And I discovered there is a lot of material published: debates, blogs, articles, books. So I tried to give a dimension to this concept, and I wrote a book. The book was in Italian, so the title was in Latin. The book has been translated by Oxford University Press, and they didn't like Latin, so we changed the title. The title is the title of this talk, The Overproduction of Truth, and the subtitle is Passion, Competition, and Integrity in Contemporary Science. Well, let me start from the time I did my PhD, when I was your age. I was in Berlin in the early 1980s, and these are the instruments we were using at the time. I don't think many people in the audience know what this object here is. This is the rotating head of an IBM electric typewriter. For me, it was a miracle of technology, because I could leave behind my mechanical typewriter, which was quite slow and not very efficient. At that time, we were still using carbon copy to produce copies of our papers, and we were corresponding with journals and other colleagues by writing letters. It seems a long time ago, but this is when I started. And this is my dissertation, which I wrote entirely with this typewriter.
When I was in Berlin, I used to go to the library almost every day. I'm a physical chemist, and I was trying to read the literature; the Journal of Physical Chemistry in particular was my favorite. In 1980, one issue of the Journal of Physical Chemistry contained about 30 papers, and it was published twice a month, 26 issues altogether: about 800 papers, less than 4,000 pages in one year. If you go back to 1960, 20 years earlier, the Journal of Physical Chemistry was publishing one monthly issue, about 20 papers per issue, 2,000 pages overall in the entire year. If you look at it today, the same journal has been divided into A, B, C, and Letters, and it publishes something like 6,000 papers and more than 60,000 pages in a year. And this is just one journal; there are now many more journals in physical chemistry. This already gives you an idea of how things have changed within the lifetime of my career. This is another interesting picture, which goes even farther back in time. This is the famous Solvay Conference of 1927, organized in Brussels. There are 29 people in the picture, and out of those 29, 17 got the Nobel Prize. The other 12 didn't get the Nobel Prize, but their names are now in textbooks. So: 29 people, a small audience like today, but an extremely high concentration of genius. Now, if you go today to a normal conference in, let's say, physics: the American Physical Society organizes four conferences per year, and the average attendance is 10,000 participants. Chemistry is similar; we go up to 18,000 participants per conference. But if you go to the medical sciences, you can get up to 50,000 participants at a single conference. This already gives you the dimensions of science and how they have changed over time. Well, the first person who tried to give a dimension to this problem was a historian of science in the 1960s, Derek de Solla Price.
He tried to do what was, at the time, a very complex piece of work: to figure out how many scientific papers had been published from 1650 until 1960, over about three centuries. And he found that about 2 million scientific articles had been produced in those 300 years. This is the original paper. If you look at this logarithmic scale, it shows the growth, already exponential over the last century, of the papers in chemistry, biology, physics, and so on. Now, look at this number here: 1910, the beginning of the last century. You realize that until 1910, the total amount of scientific papers was just a few thousand. You could say that nothing really happened in the 19th century or before, but that is not true. If you look at what happened in the 19th century, the amount of science produced is impressive, and some extremely important discoveries were made: thermodynamics, electromagnetism, X-rays, evolution, everything is concentrated in just a few thousand papers. This already tells us that things have changed quite deeply. This is a paper published in Science last year, and it shows something that de Solla Price had already realized in the 1960s: we are producing publications at an exponential rate. However, the authors of this study tried to do something more subtle. They tried to find out: are we also producing new ideas at an exponential rate? What they did was look at the keywords and key phrases in these papers, and they found that the topics discussed in them are growing only linearly. And they are growing linearly, the blue line, only if you are optimistic, because if you really look at the data, we are almost leveling off.
Which means we are producing a lot of papers, but not necessarily producing new concepts or new ideas at the same pace as we produce papers. What is certainly true is that we are now producing every year more than 2 million scientific papers, and by scientific papers I mean refereed, peer-reviewed papers, every year. So every year we produce the same amount of literature that was produced in the 300 years from 1650 to 1960. Well, there are many reasons for this, and sometimes they are good reasons. One reason is that the number of authors on a paper is growing. We are collaborating more, and collaborations, of course, help produce more studies. This is the number of papers with a given number of authors. The number of studies with a single author, which was the normal thing in the 19th century, is decreasing; some fields still have single-author papers. But look at this. This is another phenomenon, which is quite recent: the number of papers with a very large number of authors. Papers with 50 authors are steadily growing, then 100, 200, 500, 1,000 authors on a single paper. There was no paper with more than 1,000 authors until 2009; now we have papers with more than 1,000 co-authors. Well, do you know what the largest number of authors on a single paper is? Can you make a guess? How many? You have no idea? I'll tell you: 5,154. This is a paper published in the very prestigious Physical Review Letters two years ago, from the ATLAS and CMS experiments at the LHC. If you look at it, you will find that the paper has seven pages of text and 24 pages of author list, plus further pages for the institutions, because there are 344 institutions.
Of course, on one side it is extremely impressive that you can coordinate 5,000 people toward a single objective, a single target, which is basically the experiment. But it also raises the question: what is the individual contribution of a specific author to this specific paper? How can we compare a paper with a single author and a paper with 5,000 authors? Well, these are all interesting questions. But here is something interesting that I found in an editorial from 1968 in a journal called Advances in Catalysis, where the editor wrote, already in 1968: the number of papers published in every field is growing exponentially. But the time that I can dedicate to reading papers is not growing exponentially; it is more or less a constant. I can dedicate two, three, five hours a day, no more. So, in the end, the fraction of the literature that I can absorb is the ratio between how many papers I can read, which is a constant, and how many papers are produced, which is exponential. And this ratio goes to zero. This is extremely worrying, but it is true: the amount of information produced is so large that what we can capture is a proportionally smaller and smaller share. The growing number of papers is, of course, accompanied by a growing number of scientific journals. The number of scientific journals is growing exponentially, and we have something like 30,000 active scientific journals in the world nowadays. And this opens a very interesting and important topic related to quality. Now let me spend a few words on quality. What is quality? What you see here are essentially two bottles of a solution of ethanol in water, in more or less the same amount, 12% or 15%.
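As an aside, the 1968 editorial's ratio argument can be sketched numerically. The growth rate and reading capacity below are invented for illustration, not the editor's figures:

```python
def readable_fraction(year, base_year=1968, papers_at_base=100_000,
                      annual_growth=0.05, papers_read_per_year=200):
    """Fraction of a year's output that one reader can cover.

    Output grows exponentially; reading capacity stays constant,
    so the ratio decays toward zero. All figures are illustrative.
    """
    published = papers_at_base * (1 + annual_growth) ** (year - base_year)
    return papers_read_per_year / published

for y in (1968, 1990, 2020):
    print(y, f"{readable_fraction(y):.6f}")
```

With these assumed numbers, the readable share drops by a factor of about 13 between 1968 and 2020 (1.05^52 is roughly 12.6); the exact figures don't matter, only the constant-over-exponential shape of the ratio.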
However, you can go to a supermarket and buy this one for one euro, or you have to go to a specialized shop and buy this one for 400 euros. Why is that? Because the quality is not the same. And we are ready to pay that much money, well, maybe I'm not ready, but somebody is, because the quality is high. Now, the same criterion applies to scientific publications. Let me open the topic of open access, which is very popular nowadays, because open data, open access, open science are all very important concepts introduced, essentially, with the internet era. And it is a very noble concept, because the idea is that every paper is made immediately accessible to everybody. There is no copyright, because the author pays the publication cost at the moment the paper is accepted. You know how it works, I guess: publishing a paper has a cost, and this cost is paid by the author; there is no copyright, and the paper is immediately made free. The number of open access journals is increasing, and there is also a strong push to open new ones. And from a purely economical point of view, it can also be a great business. I take here two serious open access journals: one is Scientific Reports and the other is PLOS ONE. These two journals are quite popular. Scientific Reports published 25,000 papers in 2017, steadily growing; PLOS ONE published something like 30,000 papers in 2015. Now, the average cost of an open access article is about $1,500 to $2,000 per paper, so it is very easy to compute the income: $30 million, $45 million, a huge amount of money. However, looking only at these two journals, the number of submissions is now decreasing. These are data up to 2018; there was a peak.
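A back-of-the-envelope check of the income figures just quoted: papers per year times the article processing charge (APC). The $1,500 APC is the lower end of the range mentioned in the talk:

```python
def annual_income(papers_per_year, apc_usd):
    """Gross APC income: number of papers times the per-paper charge."""
    return papers_per_year * apc_usd

# Scientific Reports, 2017: ~25,000 papers; PLOS ONE, 2015: ~30,000 papers.
print(annual_income(25_000, 1_500))  # 37500000
print(annual_income(30_000, 1_500))  # 45000000
```

So tens of millions of dollars per year from a single journal, which is the order of magnitude behind the "great business" remark above.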
And now, for some reason, people are stopping submitting to these journals. Why is that? Because the quality is no longer considered as high as it was originally. And this opens the key question of open access quality. There are many very serious, very good open access journals available. But there are also a lot of what are called predatory journals. What are predatory journals? They are journals created only to make money, not to provide a service to the community. This phenomenon was highlighted by a very interesting study published by John Bohannon in Science in 2013. John Bohannon is a freelance biologist, and what he did was very interesting. He wrote a fake paper full of mistakes, full of errors: a paper describing a drug that can cure cancer, from a non-existent institution and with an invented author, completely fake. He generated something like 300 slightly different copies of the same paper and submitted them to 304 different open access journals. Well, after some time, half of those journals, 157 to be precise, accepted the paper without any problem. And the paper contained such deep errors that any first-year student should have been able to identify them. That was a clear sign that the paper had not been sent out for any review. Of course, once the paper was accepted, the bill was sent to the author: please pay. Only 98 of the 304 open access journals rejected the paper based on a referee report. As for the difference, there were about 50 journals that didn't reply, didn't react; they simply disappeared between the time of the submission and the time Bohannon's article was published in Science. This article, which became famous, showed the very dangerous side of open access journals created only to make money. Bohannon also did something else: he tried to find out where these journals are located.
He thought they would be located only in third-world countries, in Africa or Asia. Actually, that's not true: a lot of these journals are based in the United States. Very interesting, but also very worrying. Now, I think everybody gets this kind of mail; I get these mails every day, several a day. This one I found particularly hilarious, so I am reporting it here. "Dear Gianfranco Pacchioni, greetings and good day. I represent the editorial office of PsychoMed Publishing. We came across your recent article on catalysis and photocatalysis, hints from DFT, published in Topics in Catalysis. We feel that the topic of this article is very interesting", and I agree, "therefore, we are delighted to invite you to join the editorial board of our journal on civil engineering." So this publisher, with a journal on civil engineering, invites me, who published a paper in Topics in Catalysis, to join the editorial board. Well, that is just one of many examples, and you know this better than me, because you are probably getting similar things. So this is a very big problem. This guy, Jeffrey Beall, is a librarian, and years ago, in 2008, he started to compile a list of predatory open access journals, journals which are not reliable. He got up to a list of thousands of these journals. Then he had to close down his website, because his job was in danger: his university was under strong pressure from these publishers, who said, we are going to go to court, we are going to sue you if you don't stop publishing the names of the journals that are predatory. And so he closed it; the website no longer exists, because otherwise he was going to be fired. But he wrote something which I think is completely right: the act of instituting a financial transaction between scholarly authors and scholarly publishers is corrupting scholarly communication.
This was one of the great merits of the traditional publishing system: there was no monetary component in the relationship between authors and publishers. This is a very delicate aspect. I have never paid to have an article published, and I never will before my career is finished. I mean, somebody has to pay, the system has to pay, but I don't want a direct bargain between me and the publisher, because this introduces a bias: the publisher can publish whatever you want if you pay enough. That is a very dangerous, delicate issue; we may discuss it later. Of course, along with predatory journals, there is another growing phenomenon, which is predatory conferences. One day I decided, instead of deleting these invitations as soon as they arrived, to collect them. I collected them for 24 hours. These are the dates and how many invitations I got: 16 invitations in a single day, in 24 hours. Everything: biology, chemistry, polymers. I never worked on polymers. "First Global Transitions Conference", and all these kinds of things. I don't know what it is, but of course you can pay, you can go there, and you can put on your CV that you have been invited to speak at these conferences. OK, so this is the first part of my talk, which shows that science has changed completely: from the 19th-century picture, a few passionate people with strong motivation and also strong insight and knowledge, to modern science, which has become a large market, and we are all part of this market. So the first question I tried to answer was not easy: how many people form the market of science today? It took me some time to find out. Finally, using different estimates, which all pointed to the same figure, I came to a number: we have passed 10 million active researchers in the world. By this, I mean not only academics.
I also mean not only scientists but also engineers: people working in science to produce knowledge. That also means that more than 90% of all the scientists of all time are living today. Another exercise I did, which was also not so easy, was to figure out how many scientists were active, for instance, in 1930. Well, for 1930 I came up with a ballpark estimate: there were, in the entire world, something like 200,000 scientists, and 2 billion people, so there was one researcher for every 10,000 inhabitants. Then we had 1 million scientists in 1960, and 5 million in 2000. This is a real, exact number: 5 million scientists in 2000, with 6 billion people, so the ratio was already 1 in 1,200. Now we are 10 million, with 7 billion people: 1 researcher for every 700 inhabitants. If we extrapolate the growth rates of scientists and population, by the middle of the century we will have 35 million scientists and 9 billion people, so 1 researcher for every 250 inhabitants, which is a lot. If we continue this way, one day there will be only scientists in the world. That would be nice, but I don't think we will get to that point. Well, there are many reasons for this, also good reasons. But there is one very important consequence. The growth in the number of scientists has so far been exponential, but the growth of world funding for research and development is linear. So funding for research is increasing, but not exponentially. And we notice that very clearly, because the competition for funding has become extremely tight, tough, difficult. It is extremely difficult to get funded nowadays, and everybody is spending time writing applications, because the success rate is decreasing. Obviously. Well, the dimension of science, with millions of actors, also poses another question: how do we evaluate science as a global entity?
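The per-capita ratios quoted above follow from simple division; here is a sketch using the talk's own round numbers (the 2050 row is the extrapolation, not data, and "now" is taken as roughly 2018):

```python
# (researchers, world population) at the dates quoted in the talk
data = {
    1930: (200_000, 2_000_000_000),
    2000: (5_000_000, 6_000_000_000),
    2018: (10_000_000, 7_000_000_000),   # "now" in the talk
    2050: (35_000_000, 9_000_000_000),   # extrapolated, not measured
}

for year, (researchers, population) in sorted(data.items()):
    print(f"{year}: 1 researcher every {population // researchers} inhabitants")
```

The 2050 division gives 257, which the talk rounds to "1 every 250".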
Now, ask yourself: how do people evaluate the success of a TV program? Well, in Italy at least, they measure audience and share, that is, how many people are watching that program. It says nothing about the quality of the program; a program can be of very high quality, but with a very low number of viewers it is considered irrelevant. How do you measure the success of a scientist in a market-driven environment? Well, you measure papers and citations. And right now, many papers and many citations are what characterizes a successful scientist. Again, you may think this is something new. First of all, what is a citation? It is basically when, in a paper, we cite another paper by a colleague. It doesn't really measure the validity of that paper; it measures how useful that paper is. If I publish a paper that provides something everybody uses, it is highly cited. It is an imprecise measure, but it is robust, and we will come to newer indicators which are even more delicate. It has pros and cons, but it is something very well defined. Now, it was introduced a long time ago. The person who invented the idea of using the citation as a quality measure was Eugene Garfield, who died recently and is the father of modern bibliometrics and scientometrics. He invented the Web of Science; he invented Current Contents, which you may not know; other tools; the impact factor. Everything was invented by this guy, starting in 1955, a long time ago, when he introduced this idea of the citation. Well, when I was your age, there was no Web of Science, there was no internet. Actually, there was a citation index, but not on the web: it was on paper, in a huge number of volumes. Every year it was published in a different color; the blue one was, I don't know, 1988.
And we would go through it like a big telephone book, look up our name, and find how many papers had cited our papers. You can imagine how much work was behind that. It's interesting, but that was before the internet era. Now we all know what these tools basically are. This also led to another quality indicator, the impact factor. As usual, these concepts have a good side and a bad side. You know what the impact factor is: it measures the average number of citations that the papers in a given journal receive. So it is a quality measure of a journal, not of a paper; it is an average. It can be very high in some fields and journals, and go down to very low values in others. And if it is used improperly, it can become dangerous. It was created to evaluate the impact of the journal, not of individuals. Nowadays, in some fields, in some countries, in some places, it is used to evaluate individuals, which is absolutely crazy. So it creates a lot of problems, because it stimulates the idea that you must maximize your impact through citations, which means working in fields where many people are working, because you get more citations when many actors are present, and escaping from research directions that are dangerous in terms of citations, because if you work in a new field that nobody else is working on, nobody is going to cite you. That, of course, creates a lot of problems that did not exist 40 years ago. But there is some truth in it. Why does everybody want to publish in high-impact-factor journals? I think that is quite natural. Why is that? Well, this is the only example I know of where there is a kind of proof of the reason why we want to publish in high-impact-factor journals.
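As an aside, the impact factor just described is, in its standard form, computed over a two-year window: citations received in year Y to items published in years Y-1 and Y-2, divided by the number of citable items in those two years. A minimal sketch with invented figures:

```python
def impact_factor(citations_to_prev_two_years, citable_items_prev_two_years):
    """Journal impact factor for year Y (standard two-year window).

    It is an average per paper, so a journal-level measure,
    not a measure of any individual paper or author.
    """
    return citations_to_prev_two_years / citable_items_prev_two_years

# Invented example: 5,000 citations in 2023 to the 1,000 papers
# the journal published in 2021-2022.
print(impact_factor(5_000, 1_000))  # 5.0
```

The averaging is exactly why applying it to individuals is misleading: a single highly cited paper can lift the journal's average far above what a typical paper in it receives.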
In 2007, the same editorial, the same text, was published in eight different medical journals, exactly the same text. A few years later, people went back to see how many citations this article had received in each journal. And the number of citations received correlates exactly with the impact factor of the journal: in the journal with the highest impact factor, the article got the most citations; in the journal with the lowest impact factor, it got almost none. But it is exactly the same text. Well, that is a fact. Of course, we also have to say, and this is not my statement, but somebody said it and I agree, that in the end the impact factor measures the quality of the journal, and more or less it reflects the reputation that we have of the journal. I am a chemist; the most important journals in chemistry are JACS and Angewandte Chemie, and these have the highest impact factors in chemistry. So, in some way, there is some correlation between impact factor, reputation, and quality. Then, about 15 years ago, an index to evaluate individual people was invented: the h-index. I don't have to explain what it is; I think everybody knows. Of course, it doesn't work for young researchers, and it grows steadily with age, so it has its limits. But what is interesting is this: we are here today, tomorrow, and the day after to discuss how to use solar energy and produce new fuels. Well, there are scientists whose entire job is studying how to measure science. There are communities of people studying how to evaluate other scientists numerically; they have their own journals, such as Scientometrics and the Journal of Scientometric Research. There are many indices: this book lists more than 400 different indices to evaluate scientific research. However, there is a very famous economic law, Goodhart's law, which says: when a measure becomes a target, it ceases to be a good measure.
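For completeness, the h-index mentioned above is well defined: a researcher has index h if h of their papers have at least h citations each. A minimal sketch:

```python
def h_index(citations):
    """Largest h such that h papers have >= h citations each."""
    h = 0
    for rank, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= rank:
            h = rank   # the rank-th most cited paper still clears the bar
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # 4
print(h_index([25, 8, 5, 3, 3]))  # 3
```

Note how the second list has more total citations but a lower h: the index rewards a sustained body of cited work, and, as said above, it can only grow with career length.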
Because people try to maximize that specific index, and that specific index may no longer be a good measure of what you wanted to measure. However, the evolution may be even worse. When I see this, I always say: when this becomes the way to evaluate people in science, I want to stop. It's called altmetrics. Have you heard about altmetrics? Altmetrics are the new way of evaluating the impact of a scientific paper. They are not based on citations, where one scientist cites a colleague; they are based on how much a paper resonates on social media: how many times the paper is downloaded, how many likes it gets on Twitter, how many times it is cited on Facebook, or whatever. So it is based, essentially, on the web. And we know how easy it is to manipulate these kinds of things, which is, of course, dangerous. And if you open any journal nowadays, you will find citations and altmetrics next to every paper. OK, so let me come to the third, last, and most important part of my talk. The growth in the dimensions of science also carries with it a number of ethical problems, which are listed here: bad practices which were almost unknown 40 years ago but are now growing in number and percentage. I will go through them quite quickly: plagiarism, falsification of data, fraud, duplication of papers, irreproducible results, wrong or irrelevant results. For each of these topics, there are very serious studies behind the claims. Let me give you, however, a first, very important message. Science is a very objective activity; it is still the most objective human activity, because we don't trust something just because somebody says it. We want to check, we want to verify, we want to prove that things are a given way. Objectivity is the basis of science, and it is also the basis of the authority of science in society. This is a very important message and a very important value.
We have to defend this, in particular in an era of fake news and the circulation of a number of very strange notions. Science gives us a shared, verifiable picture that connects nature, people, and theories, and we have to defend this very important concept. Why am I saying this? Because the bad practices are growing in importance. Plagiarism: OK, there is a lot of material about plagiarism, but it is not really a big problem. Every paper is now scanned automatically to see if there are portions of duplicated text, and there are studies showing that naive plagiarism is not a very significant phenomenon; we are talking about 2% or 3% of the total number of papers published. But clever plagiarism, what does that mean? If you change the text a little, if you change the figure a little, and so on, that is extremely difficult to spot. Interestingly enough, plagiarism is not considered one of the most negative bad practices. Why? Because you just replicate something; you don't distort reality. It's not that you say something wrong; you are just saying twice something that has already been said by another person. OK, falsification of data. This is mostly analyzed through interviews and statistical questionnaires. Apparently, about 2% of people admit to having falsified data at least once in their life. But a larger number of people admit that at least once they have omitted a data point, or didn't give enough weight to another set of data, or tried to adapt what they found to the working hypothesis, which is a very common and very natural attitude: we want to demonstrate something, so we forget a point, or we say, OK, we discard this point because maybe the experiment was not clean enough, and so on. That is a significant number. It is very difficult to put exact values on it, but there is something there.
But among bad practices, here is something completely new, which has been spotted only recently. Believe me, it is almost unbelievable, but it's true. This is a journal called Tumor Biology, from Springer, which doesn't exist anymore; it has been closed. Why was it closed? Because they had to retract 107 papers. And why did they have to retract 107 papers? Well, you know that when you submit an article to a journal, most of the time you are asked to suggest some potential referees. In this case, the authors were giving the names of referees who did not exist: they created false email addresses and false CVs, so that the editors were picking these false referees and sending them the papers, but the review requests were going back to the authors themselves. Of course, the evaluations were extremely positive. This was in clinical research, and when they discovered it, they had to close down the journal and retract 107 papers. This happened mostly in China, and the Chinese government has taken serious action against it. I am citing this only to show you the level of ethical behavior, how far people can go, just to get a paper published. Well, fraud is a very interesting topic, which we have no time to discuss in depth. Actually, fraud in science is completely stupid, because since science is based on the idea of verifying results, if you say something that is not true, sooner or later it will be found out. If you behave improperly and invent results, and if the result is important, sooner or later somebody is going to repeat the experiment. There are, however, two kinds of fraud. This is a very, very famous one. I don't know how many in the audience have heard the name of Jan Hendrik Schön, probably not the youngest. Schön was a guy of your age, just after his PhD. Brilliant.
He graduated from the University of Konstanz and went to Bell Laboratories in 1998, 1999: a very competitive environment, in the years when everybody was expecting the miraculous arrival of molecular electronics, the new transistor based on a single molecule connecting two electrodes. Then he started to produce a series of miraculous results, and he published them in very important journals like Science and Nature, and so on. For three years, he produced impressive data. The story is reported in one chapter of my book, which summarizes it, but I took it from this very nice book, which gives a full account. After three years, people were not able to reproduce his results, and finally they discovered that everything had been invented. And do you know how it was discovered that he was manipulating results? One author found that two different figures, from two different experiments published in two different journals, had exactly the same background noise, which is statistically impossible. And from there it started to unravel. He was basically manipulating the data to produce the results that the community expected, that the community wanted to believe, at that time. Now, this fraud was exposed and concluded in three years, and the damage was limited to the community of molecular electronics. There are frauds, however, which have a much stronger impact on society. And this is such a case. This guy is Andrew Wakefield. I don't know if you have heard of him. He was a medical doctor; I say he was, because he is still alive, but he is no longer a medical doctor: he has been struck off the medical register in the UK. In 1998, he published an article in The Lancet, one of the most prestigious medical journals, claiming that there is a risk, more than a risk, that vaccines induce autism and other diseases.
This was later shown to be completely fake, but it took more than 10 years, because you need a long time to demolish a clinical study: a clinical study needs many, many trials before you can prove or disprove it. The news that vaccines induce autism spread very rapidly around the world. And there are countries, like Italy, where there are still communities of people who are convinced — and will never change their minds — that vaccines are dangerous and induce autism. And that was due to a fraud by this guy, who has been, as I said, struck off. But the damage remains: you can still find his claims on the web, and people believe what he says.

Well, this leads me to another topic, which is irreproducibility. Again, this is mostly in clinical studies, but there is an increasing number of studies which are not reproducible. Remember that reproducing a scientific paper requires a lot of investment with no gain: if you are lucky, you reproduce the result, and there is no glory in that; if you are unlucky, you fail to reproduce it, and then you wonder what you did wrong, and so on. So it's a lot of work for nothing. A couple of pharmaceutical companies tried to reproduce a number of published studies, and the rate of successful reproduction was extremely low — 10%, 20% — because these studies were done with poor statistics, poor methods, poor data, and so on and so forth. And of course this carries a huge cost in medical research, so it is a very important warning. More and more, when acting as a referee, I find papers which contain basic mistakes that we should not see in a scientific paper. This is a study published about 10 years ago by James F. Scott, an expert on ferroelectrics, who was tired of seeing people claim ferroelectric behavior in materials. This is the classical hysteresis curve you get from a true ferroelectric material. And yet you will find papers published on the basis of curves like these ones.
And what did he do? He took a banana, applied an electric field to it, and said: look, this is the diagram of a banana under an electric field. This is not a ferroelectric — it's a banana. So please don't publish if your curve looks like that. And he listed a number of studies which are wrong simply because people cannot distinguish between these two kinds of curves.

Well, but this leads to probably the most delicate issue: papers which are not wrong, which are not fraudulent, but are simply irrelevant. And here it is very difficult to provide really good estimates. But there is one study which is quite robust, although already quite old. Essentially, it is based on citations, and what it shows is that 27% of papers account for about 80% of the total citations worldwide. So most of the citations go to roughly a quarter of the papers. And there is a large number of papers which are never cited — which doesn't mean they are not good papers, but they are certainly not so useful, at least according to the logic of citations. This is a problem, because if we produce too many papers which are not relevant or useful or important, there are two issues. One is loss of credibility: we are basically producing papers, not producing knowledge. The second is that one day the taxpayer could ask: what are you doing with the money we are giving you? If the answer is "we are producing papers" — well, taxpayers in Italy may not care, but in the US they can be very aggressive, since there people go to jail if they don't pay their taxes, so the question becomes much more sensitive.

Why am I telling you these stories? Because science is extremely important for society. We all know the stories of pseudoscience and fake news. Right now in Italy we are debating a congress on flat Earth, and I looked this up a little while ago: in the US, the Flat Earth Society is an actual association. It looks like a normal learned society. They have a conference every year.
They have sponsors, speakers, sessions, and a big website. So it looks indistinguishable from any scientific conference, but what they discuss is that the Earth is flat. This is another reason why I say that what is going on in science is very dangerous. A few years ago, an open-access journal published a paper in which the authors claimed that coal fly ash — what comes out of the engines of airplanes — is released on purpose to spread chemicals in the atmosphere and change the climate. The paper has been retracted, because if you read it, it is ridiculous; how it got published at all is quite surprising, but it has been retracted because there is no foundation to it. Yet if you go on the web, you will still see many sites saying: look, there is a scientific paper proving that we are right, the coal fly ash is spread on purpose by governments. So this provides support to the people who circulate this fake news.

I'm near the end of my talk. I don't think many people in the audience have ever heard of this guy, Robert Merton. Nobody. Well, Robert Merton is to a sociologist what Enrico Fermi is to a physicist: the most important sociologist of science of the 20th century. He formulated some ethical principles of science which I'm going to summarize — four ethical principles which are still valid. We all use them without knowing it. The first is universalism: everybody can contribute to science, without distinction of race, nationality, culture, religion, or gender. This is very important; science has always been open in that respect, and this place is actually a demonstration of what I'm saying. The second very important pillar of ethical science — Merton called it "communism", in the sense of communalism — is that science is not a private matter but a shared property. We have to share results, we have to discuss: we want to know what you are doing and do it together.
The third looks quite naive if you consider it today; it is called disinterestedness. But there is actually a very important message in it. Merton said that researchers must find their motivation in the pleasure of discovering new things, not in their careers, in winning awards, or in funding. Of course, we all want awards, a better career, and better funding. But a scientist must have another motivation first: he must be driven by curiosity, by the desire to innovate. The awards and the funding are a consequence, not the target. And many of the bad practices I've shown you exist precisely because people are reversing this order, putting the career before the motivation, before the passion. The last principle is what we have already mentioned. Merton said: science is organized skepticism. The scientific community must always verify the truthfulness and solidity of each statement before accepting it as true. This is really the most important pillar of science.

Now, I think we need to rediscover these fundamental ethical principles. And what can we do in practice? This is my last slide. It's not easy, because when you are inside a market-driven system, the system is not easy to change. Here are just some very simple ideas. First, as we are doing today, we have to open a debate, a discussion — and also require that everybody, in particular new students, or anyone submitting a grant application, has signed or shown that they have passed some kind of course, some kind of education, about research integrity, because research integrity is becoming a problem. So we have to introduce these courses, at the PhD level at least. We probably also have to shift from quantity to quality: we have to ask for fewer papers, but of higher quality and higher impact. In evaluating people, we can still look at their overall production, but we have to concentrate on what has really become important, what has really driven a change in the field.
So, require that investigators submit only their top papers. This may be difficult, but at some point, I think, it could become useful. And we need to know that the journals we publish in are robust, reliable, and provide a good service to the community. When you put your money in a bank, the bank is certified to be doing proper business, not dirty business. We need something like that for journals too, in order to discriminate between purely commercial enterprises and journals which are providing a service. That's all I wanted to say. Thank you for listening, and of course I'm ready to discuss.

[Moderator] Thank you very much for this inspiring — and at times tough to listen to — talk. Now it's time for questions or comments.

[Question] Hi, I really enjoyed your talk, thank you. You gave really compelling arguments against open access, and I was wondering: we are all here because of a grant that came from the EU, and the EU is mandating open access. How do you think we can avoid that, or change it?

[Answer] Yeah, excellent question. I think we do want a transition toward open access, but we want it to be smooth and without negative consequences. I think the only way to move toward open access is for publishers and institutions, or nations, to negotiate at a national, centralized level how to proceed. We cannot leave single authors paying single fees to the publishers. In some countries this kind of regulation and agreement has already been reached — recently even with Elsevier, one of the most difficult publishers, in Norway or Sweden, and Holland. So there must be a transition. Because remember, the way it worked in the past was a very effective quality-control procedure: the librarian would say, OK, we have a journal, maybe a new journal — how many of you are interested in this journal?
If three people said yes, the librarian said no, I don't want to subscribe, it's not worth it. If 30 people said yes, then maybe it was interesting. This was a very clear quality criterion: universities only subscribed to journals with a high reputation, because the community wanted to read them. Open access has the problem that I've shown you, which everybody knows, and which clearly arose because it's so easy: you open a website, you send out automatic emails, and even if you only get 100 papers in a year at 1,000 per paper, your salary for the year is covered. On the contrary, a serious publisher has very high costs. Let me make this point very clear. Assume you are sitting at the desk of, I don't know, the best journal you can think of — The Lancet, Nature: you get a thousand papers every day. How do you select? You need people who read the submissions, interact with the authors, and discard what is not good. This costs time. Then you have to assign the referees, and if you want the best experts in the world, that takes a huge amount of time.

[Answer, continuing on a follow-up] Yeah, yeah. I mean, since there is a cost in the publication process, which is what I was describing, that cost will be part of the cost of science, no question about that. Somebody has to pay. Now, the question is — exactly — that open access will make all the knowledge available to companies for free. Nowadays, companies and private enterprises contribute about 25% of the overall cost of publications, because if Google wants to know what we are doing, they have to pay the subscriptions. In the future, with open access, they will have nothing to pay, because everything will be free. Well, I mean, for taxpayers, the publication cost is part of the scientific cost. Exactly: you pay salaries, you pay consumables, you pay for laboratories, and you have publication costs.
So the taxpayer has to pay one way or the other; it enters into the cost of science. You cannot remove it — it is part of it. If you want to remove it, then, in my view, the economic analysis is incorrect from the beginning. Expecting taxpayers not to pay for publication costs at all is impossible. OK.

[Question] Just to complete the question of open access, because there is a very strong push toward open access: what do you think about Plan S? — Sorry? — What do you think about Plan S?

[Answer] I think — OK, let me stay soft. If it is done in order to accelerate the transition, fine. If it is done because they really want to implement it as it stands, it is a disaster. But let me finish: this would destroy, for instance, the academic societies. So something which is, in principle, very ideological has to be done through a smooth transition. I am very strongly in favor of Europe, but this is the typical top-down procedure that Europe sometimes takes, and in this case there could be negative consequences.

[Moderator] Shall we leave it there, or take more on open access? OK.