Wrote a very fast URL shortener

Artemix on January 29, 2018

Yesterday I came home and, half-asleep and just before going to bed, decided to code something. What? I had no idea at the moment I did so. tl...
 

Nice work. Some feedback:

Using the file system is going to degrade in performance very quickly. It's a nice idea for a URL shortening system that only you use (< 100 URLs), but file I/O is going to be MUCH slower than simply using a database. You also miss out on the data compression that most database storage engines give you for free.

Most importantly, it also limits you quite a lot if you ever wanted to enhance your program. Let me give you an example.

Right now you are not checking for URLs that already exist. This should be fixed, because there is no point in storing the same URL under several different codes. To fix it with the current design, you would need to look for the given URL in every file. That is a linear operation, but the sheer amount of I/O required to open every file, read its contents, check for equality and then close the file is going to be a total killer for your app if you have millions of URLs!
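To make that concrete, here is roughly what such a per-insert scan would look like (assuming one file per short code whose contents are the long URL, which is only my guess at your layout):

```php
<?php

// Hypothetical layout: one file per short code, whose contents are the long URL.
// Checking for a duplicate then means scanning every file -- one open/read/close per stored URL.
function findExistingCode(string $url, string $dir = 'urls'): ?string
{
    foreach (scandir($dir) as $file) {
        if ($file === '.' || $file === '..') {
            continue;
        }
        if (file_get_contents("$dir/$file") === $url) {
            return $file; // reuse the existing short code instead of creating a new one
        }
    }
    return null; // URL not stored yet
}
```

With a database, the same check is a single indexed lookup on the URL column instead of one read per stored entry.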

 

Nice!

The way you wrote it lets you easily plug in different "storage backends"; you could then benchmark locally the difference between, say, a file storage backend, a Redis backend and a DBMS backend.

It could be interesting to find out which is faster for this use case ;-)
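Just to illustrate what I mean, such a backend abstraction could be as small as this (the interface and class names are made up, not taken from your code):

```php
<?php

// Minimal sketch of a pluggable storage backend.
interface UrlStore
{
    public function save(string $code, string $url): void;
    public function resolve(string $code): ?string;
}

class FileStore implements UrlStore
{
    public function __construct(private string $dir) {}

    public function save(string $code, string $url): void
    {
        file_put_contents("{$this->dir}/$code", $url);
    }

    public function resolve(string $code): ?string
    {
        $path = "{$this->dir}/$code";
        return is_file($path) ? file_get_contents($path) : null;
    }
}

// A RedisStore or a PDO-backed store implementing the same interface could then
// be swapped in and benchmarked against FileStore without touching the rest of the app.
```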

 

If I'm stuck in a boredom trip again, I may do it!

 

I am curious: why would file I/O be any faster than a database?

I would be more convinced if you use something like memcache.

 

The reason for that is that I'm hosting it on an SSD, and the only DBMS I currently have at hand lives on another server with mandatory TLS auth (PgSQL configured for security).

Also, as I answered to another commenter, I really didn't care about speed when I made it.

When I was able to test it, I got blazingly fast results, though I imagine that's to be expected since I was the only one using it.

We'll see in the long run how it handles!

(Also, using a PHP caching extension may be a good idea.)
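For example, something along these lines could work, assuming APCu (just one possible extension) and the same one-file-per-code storage, though this is only a sketch:

```php
<?php

// Cache resolved codes in shared memory so repeated hits skip the disk entirely.
function resolveCached(string $code): ?string
{
    $url = apcu_fetch("url:$code", $hit);
    if ($hit) {
        return $url; // served from shared memory, no disk access
    }
    $url = @file_get_contents("urls/$code");
    if ($url === false) {
        return null; // unknown code
    }
    apcu_store("url:$code", $url, 3600); // keep it cached for an hour
    return $url;
}
```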

 

Running a database on your current server is probably the best option. I was wondering because you say "very fast" in your title, and then when I read on, you use file I/O. That contradiction bothered me enough that I couldn't stop wondering.

Sadly, it's currently on a small server that I have no root access to, and which doesn't have a lot of storage space.

Still, good point.

 

Nice idea!
I always find it interesting to try and implement things we use daily without thinking too much about them.
Although to be fair... for URLs under 40 characters, your tool makes them longer ;)

 

It's mainly because I have a long domain name :p

I've been searching for a shorter domain for some time, but it's not really a priority.

I mainly developed it out of boredom; I don't expect it to be used in production, even if that would be a nice idea.
