There is a website with a PHP index file, a database with a content table, and, of course, articles in it.
If you open index.php?id=6, it shows the article with id=6, and so on. There are also 4-5 other pages, like "About us", "Contact us", etc.
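For context, a minimal sketch of what such an index file usually looks like. The table and column names (content, title, body) and the connection credentials are assumptions, not taken from the question; this uses PDO with a prepared statement:

```php
<?php
// Minimal sketch of index.php?id=6 style routing.
// Assumed schema: content(id, title, body) — adjust to the real table.
$pdo = new PDO('mysql:host=localhost;dbname=site;charset=utf8mb4', 'user', 'password');

$id = (int)($_GET['id'] ?? 1); // cast to int: invalid input becomes 0

$stmt = $pdo->prepare('SELECT title, body FROM content WHERE id = ?');
$stmt->execute([$id]);
$article = $stmt->fetch(PDO::FETCH_ASSOC);

if ($article === false) {
    http_response_code(404);
    exit('Article not found');
}

echo '<h1>' . htmlspecialchars($article['title']) . '</h1>';
echo '<div>' . nl2br(htmlspecialchars($article['body'])) . '</div>';
```

With a primary-key lookup like this, the number of rows in the table has almost no effect on the per-request cost.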
At the moment there are about 400 articles, but I've been told that once there are more than 1000 articles (i.e. "a lot"), the site will "fall over".
How true is this?
Answer 1, authority 100%
This is a lie. =)
1000 articles is a trivial task for MySQL, even on weak hosting and even if full-text search is used heavily.
Problems can start if a crowd of visitors descends on the site. But under that, even a simple business-card site can go down.
Nonsense; retrieval speed depends on the server's characteristics. If the id field is a PRIMARY KEY of type SMALLINT UNSIGNED, lookups won't be a problem; id can go a little over 65,000 (65,535 to be exact). If you need more, use MEDIUMINT UNSIGNED, which goes up to almost 17 million. (The "length" of 5 or 8 in MySQL is only a display width; it does not affect the range of the type.)
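As a sketch, assuming a table named content with title and body columns (the exact names are guesses, not from the question), the corresponding MySQL DDL might look like:

```sql
-- SMALLINT UNSIGNED holds ids up to 65,535;
-- MEDIUMINT UNSIGNED goes up to 16,777,215.
-- A display width like SMALLINT(5) does NOT change the storable range.
CREATE TABLE content (
    id    MEDIUMINT UNSIGNED NOT NULL AUTO_INCREMENT,
    title VARCHAR(255)       NOT NULL,
    body  TEXT               NOT NULL,
    PRIMARY KEY (id)
) ENGINE=InnoDB;
```

If in doubt, plain INT UNSIGNED (up to about 4.2 billion) costs only two extra bytes per row.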
I remember long ago, when I was starting to learn PHP and SQL, I also thought for some reason that performance depends heavily on this… Anyway, I dumped 2 million records into the articles table with a horrible for loop, and it had practically no effect on SELECT speed 🙂
PS: especially since you select by index. Back then I barely knew what an index was; there were none except the primary id, and the selection was done by title…
1000 articles is no problem if you don't pull everything at once and the server has at least 512 MB of RAM. Select by ID, as already mentioned, and fetch only the rows and columns you need. If things are really bad, or the project is designed for heavy load, you can additionally index the most frequently queried fields for faster access (with the extra indexes, the database will take up a little more space on the disk).
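For example (the content table and its title column are assumed names), adding an index on a frequently searched field and fetching only what's needed might look like:

```sql
-- Index a frequently searched column
-- (costs some disk space, speeds up lookups on that column).
CREATE INDEX idx_content_title ON content (title);

-- Fetch only the columns you need, not SELECT * over everything.
SELECT id, title FROM content WHERE id = 6;

-- Verify the index is actually used for a title search.
EXPLAIN SELECT id, title FROM content WHERE title = 'Some article';
```

EXPLAIN is the quickest way to confirm a query hits an index instead of scanning the whole table.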