Cache all port 80/443 traffic on an Ubuntu server

Closed Posted 3 years ago Paid on delivery

An application on my Ubuntu server very often makes requests to a specific web page on ports 80 and 443 and downloads the same files a few times a day.

I need to speed up these requests, so I think a good idea is to set up a cache server, for example Squid, and push all my traffic (ports 80 and 443) from the local network through this cache server. Frequently requested pages or files would then be kept on local disk in order to avoid future requests to the remote web page.
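For illustration only, a minimal squid.conf along these lines might look something like the sketch below (the 192.168.1.0/24 network range, cache size and refresh times are just example values, not my real setup):

    # /etc/squid/squid.conf - minimal caching proxy sketch (example values)
    http_port 3128                               # clients send proxied requests here
    acl localnet src 192.168.1.0/24              # assumed local network range
    http_access allow localnet                   # only allow the local network
    http_access deny all
    cache_dir ufs /var/spool/squid 10000 16 256  # keep up to ~10 GB of objects on disk
    maximum_object_size 512 MB                   # let large file downloads be cached too
    refresh_pattern . 0 20% 1440                 # re-serve cached objects for up to a day

Note that plain HTTP (port 80) can be cached this way directly, but HTTPS (port 443) is normally only tunneled through the proxy with CONNECT, so caching it would additionally require Squid's SSL-Bump interception and trusting a local CA certificate on the client.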

I need someone who will implement this solution and write up how it was done. I will give access to my testing VPS, and I will apply the same config to my production server myself.

Linux Squid Cache Varnish Cache Ubuntu

Project ID: #29464788

About the project

7 proposals Remote project Active 3 years ago

7 freelancers are bidding an average of $42 for this job

tanujchugh

Hi, how are you! I am a system admin. I will check the issues on your Ubuntu server and resolve them as per your requirements. I have expertise in the relevant field of server management. I have 14 years of experi More

$40 USD in 1 day
(319 reviews)
6.9
sachinkumar19

Hi, I'm a Red Hat certified system administrator. I have worked on multiple Linux projects, and I would like to work on your project. Please initiate the chat to talk about the project in detail. Thanks & Regards, S More

$100 USD in 7 days
(21 reviews)
4.3
Deepak904121

I am a Linux admin with 7 years of experience in this field. I have worked in the web hosting industry and managed data centers remotely. I provide full server management support and can do this for you. I also have experience w More

$20 USD in 1 day
(5 reviews)
2.8
raksanthony

I am a Linux admin with 7 years of experience in this field. I have worked in the web hosting industry and managed data centers remotely. I provide full server management support and can do this for you. I also have experience w More

$25 USD in 1 day
(1 review)
1.0
dat30

Hello! You probably want to balance your requests instead of using local caching. If the problem lies in the time spent processing the requests, then you can cache the answers, but if the problematic requests are static More

$56 USD in 1 day
(1 review)
0.9
omkhard09

Hi, I am Om. I think you should: - Try putting a load balancer in front of your server, so that it not only divides the workload across servers but also speeds up traffic to the server and then to the client.

$20 USD in 7 days
(0 reviews)
0.0