Search Results

Search found 17852 results on 715 pages for 'load balancer'.

Page 420/715 | < Previous Page | 416 417 418 419 420 421 422 423 424 425 426 427  | Next Page >

  • PDF to Image Conversion in Java

    - by Geertjan
    In the past, I created a NetBeans plugin for loading images as slides into NetBeans IDE. That meant you had to manually create an image from each slide first. So, this time, I took it a step further. You can choose a PDF file, which is then automatically converted to an image for each page, each of which is presented as a node that can be clicked to open the slide in the main window. As you can see, the remaining problem is font rendering. Currently I'm using PDFBox. Any alternatives that render fonts better? This is the createKeys method of the child factory; ideally it would be replaced by code from some other library that handles font rendering better:

    @Override
    protected boolean createKeys(List<ImageObject> list) {
        mylist = new ArrayList<ImageObject>();
        try {
            if (file != null) {
                ProgressHandle handle = ProgressHandleFactory.createHandle(
                        "Creating images from " + file.getPath());
                handle.start();
                PDDocument document = PDDocument.load(file);
                List<PDPage> pages = document.getDocumentCatalog().getAllPages();
                for (int i = 0; i < pages.size(); i++) {
                    PDPage pDPage = pages.get(i);
                    mylist.add(new ImageObject(pDPage.convertToImage(), i));
                }
                handle.finish();
            }
            list.addAll(mylist);
        } catch (IOException ex) {
            Exceptions.printStackTrace(ex);
        }
        return true;
    }

    The import statements from PDFBox are as follows:

    import org.apache.pdfbox.pdmodel.PDDocument;
    import org.apache.pdfbox.pdmodel.PDPage;
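
    One possible direction, sticking with PDFBox but assuming an upgrade to the 2.x line is acceptable: the newer PDFRenderer API lets each page be rendered at an explicit DPI, which generally produces sharper text than the old PDPage.convertToImage() call. A minimal sketch of what the page loop could look like (ImageObject, file and mylist are the existing names from the snippet above, and the fragment is meant to sit inside the same try/catch):

    import java.awt.image.BufferedImage;
    import org.apache.pdfbox.pdmodel.PDDocument;
    import org.apache.pdfbox.rendering.PDFRenderer;

    // Sketch only: assumes PDFBox 2.x on the classpath.
    PDDocument document = PDDocument.load(file);
    try {
        PDFRenderer renderer = new PDFRenderer(document);
        for (int i = 0; i < document.getNumberOfPages(); i++) {
            // Rendering at 150 DPI instead of the default 72 noticeably improves glyph quality.
            BufferedImage image = renderer.renderImageWithDPI(i, 150);
            mylist.add(new ImageObject(image, i));
        }
    } finally {
        document.close();
    }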

    Read the article

  • How to embed woff fonts for iframe source pages?

    - by Mon
    I am trying to make and embed a slideshow (of text, photos, audio, video, etc.) in my HTML5 site by loading web pages consecutively inside an iframe embedded in my first page. Most or all of the frame source pages, i.e. the pages loaded inside the first iframe, are mine, but they are located in many different places. All of these pages are in an Indic language. I can use a UTF-8 charset and a lang="" declaration, and that's probably enough functionally, but I also want to embed my preferred Indic Unicode font in WOFF format via the CSS3 @font-face rule, so the size and look of the text are uniform, and the way I want them, throughout the slideshow. The problem is that there are many, many pages in the slideshow, all located in various places with many more linked pages, and it is next to impossible, or at least extremely tedious, to embed my custom WOFF font in every single page (which would also require separate CSS and uploading the fonts in every single instance). Besides, this may make the slideshow very heavy, sluggish and cumbersome for the user, since it would have to load the custom Indic font again and again every time a new page is loaded in the iframe. I am not sure about this, though. Is that how it works? I ask because I noticed that when I embedded my custom WOFF font in the 'first' page, it did not have any effect on the pages loaded inside the iframe. If I embed the font in some of the pages in the iframe, the next pages still don't get my font. Is there a way to embed my custom WOFF font only once, preferably in the first page where the first iframe is, and pass its effect on to all the pages embedded/loaded through the iframe so their text shows up in my initially embedded WOFF font, without embedding the font in every single one of them? Please help!

    Read the article

  • OpenVPN and Squid Setup troubleshooting

    - by Adam
    I am trying to set up Squid to tunnel via OpenVPN so that I can just enter an IP and port in my browser settings and use it as a US proxy. My server is an OpenVZ VM. I'm running into some issues. I set up OpenVPN using: http://safesrv.net/install-openvpn-on-centos/ As part of that guide I also ran:

    iptables -t nat -A POSTROUTING -o venet0 -j SNAT --to-source
    iptables -t nat -A POSTROUTING -s 10.8.0.0/24 -j SNAT --to-source

    I installed Squid using this guide: http://www.server-world.info/en/note?os=CentOS_6&p=squid and from that guide I changed acl lan src 10.0.0.0/24 to acl lan src 10.8.0.0/24. Next, I went to my browser proxy settings, put 10.8.0.1 in the HTTP field and the port I had set up in the Squid config file, and tried to load a page. Nothing connects. Any help? What am I doing wrong?

    Read the article

  • Cannot find GRUB - Ubuntu/Windows 8 dual-boot

    - by ubeatlenine
    Hello Ubuntu community, I find myself in an interesting situation. I have a Dell Inspiron 531 with Windows Vista. Recently my brother decided it would be a good idea to overwrite Vista with the Windows 8 consumer preview. Since we have had this PC for a very long time, we have long since lost the Vista CD, and according to the Windows 8 preview website you cannot recover your previous OS without it. I thought this would be a good opportunity to try out Ubuntu (since we obviously cannot keep the preview as an OS), but it appears that Ubuntu 11.10 Desktop is not compatible with Win8. Ubuntu doesn't run from the LiveUSB I made, instead it freezes on the loading screen and then disintegrates into black and white stripes. I blamed this failure on Ubuntu not being compatible with win8 yet and tried to install Ubuntu from the USB on a partition made from the remaining space on my hard drive - about 100GB. However the installer crashed while loading modules and told me I didn't have enough disk space. Since then, I have not been able to load either Ubuntu or Windows, BIOS is shifted over to the left of my screen, and I always get the same message: error: unknown filesystem grub rescue> typing "ls" at the prompt gives me the following: (hd0) (hd0,msdos7) (hd0,msdos6) (hd0,msdos5) (hd0,msdos2) (hd0,msdos1) does this mean I have multiple partitions running windows on my computer? Is it possible to recover Vista without the disk? Are all of my problems stemming from Ubuntu not being compatible with Win8 preview? (I realize the majority of my questions are about Windows, but seeing as the prompt I get is for grub I thought I would ask here first.) Any insight anyone has on this predicament would be greatly appreciated.

    Read the article

  • kernel 2.6.36 not booting

    - by Saumitra
    Hi, I'm a newbie to kernel programming. I am trying to boot kernel 2.6.36 on my ECG machine. It was working perfectly on 2.6.33.2. It gets stuck at this step:

    ## Booting kernel from Legacy Image at 81000000 ...
       Image Name:
       Created:      2010-12-27   5:55:56 UTC
       Image Type:   MIPS Linux Kernel Image (gzip compressed)
       Data Size:    1974278 Bytes = 1.9 MB
       Load Address: 80100000
       Entry Point:  80104730
       Verifying Checksum ... OK
       Uncompressing Kernel Image ... OK
    Starting kernel ...

    After this the system either resets or hangs. I have also checked the configuration and set it properly. Please let me know what could be wrong.

    Read the article

  • How long does it take in practice to warm up large in-memory databases?

    - by Sim
    Companies such as Peak Hosting are offering 64-core machines with 512 GB of RAM for $2K/month. This is a very interesting choice for in-memory databases such as Memcached/Redis, as well as databases whose performance degrades rapidly when the data & indexes don't fit in RAM, such as MongoDB. My main concern with monster machines such as these is the time it takes to warm up an in-memory database. In my experience, theoretical metrics, e.g., that a SATA disk can deliver 100 MB/sec, fall short of what happens in practice. Even at that rate, 100 MB/sec means that filling a 512 GB machine from SATA disks can take roughly an hour and a half (!). I am looking for real-world reports of warm-up times for machines with very large memory. Please share details of the software on the machine, data size, storage configuration (e.g., SATA or SSD), network, hosting/cloud provider if relevant, etc.
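
    As a back-of-the-envelope check on that claim (just the arithmetic implied above, with the sequential-read rate as an assumed input):

    // Time to stream 512 GB into RAM at an assumed sustained 100 MB/s sequential read rate.
    public class WarmupEstimate {
        public static void main(String[] args) {
            double ramGb = 512.0;               // memory to fill
            double throughputMbPerSec = 100.0;  // assumed sustained read rate
            double seconds = (ramGb * 1024.0) / throughputMbPerSec;
            System.out.printf("~%.0f seconds (~%.1f minutes, ~%.2f hours)%n",
                    seconds, seconds / 60.0, seconds / 3600.0);
            // Prints roughly 5243 s, i.e. close to an hour and a half -- and that is the
            // optimistic sequential case; random I/O while rebuilding indexes is usually far slower.
        }
    }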

    Read the article

  • Readyboost on Windows 7 x64

    - by RobLaw84
    I'm thinking of buying one or more flash drives or an SD card to use with the ReadyBoost function on my 64-bit Windows 7 machine. I have a few questions before I put my hand in my pocket and buy anything. If I go ahead, I would be using the fastest available flash/SD media. I have 6 GB of RAM currently installed, so will ReadyBoost make any difference to boot / program load times? Will 2 x 2 GB flash drives be quicker than 1 x 4 GB, or is the limitation on the motherboard? Would an SD card be better than a USB flash drive? Thanks

    Read the article

  • Cannot open some websites

    - by Jayashree
    Hi all, I have a very specific problem. We have a Wi-Fi router at home which supports three laptops and a desktop. For the past month or so, I've been unable to open a number of websites on our HP desktop, Dell laptop and my MacBook. These include everything connected with http://wordpress.org and several others. The page simply refuses to load. I can't access some other websites as well. I've tried everything. We've rebooted the router and deleted all the cookies/download history, but nothing works. I've tried accessing these websites in IE, Chrome, Firefox and Safari. Strangely, when friends use their laptops on the same Wi-Fi connection, the websites open just fine. What do I do? I'm getting desperate here. Jayashree

    Read the article

  • Question about server usage, big community platform [closed]

    - by Json
    Possible Duplicate: How do you do Load Testing and Capacity Planning for Web Sites

    I'm working on a community platform written in PHP and MySQL, and I have some questions about server usage; maybe someone can help me out. The community is based on jQuery, with many AJAX requests to update content. It makes 5-10 AJAX (JSON, GET, POST) requests every 5 seconds; the requests fetch user data like notifications and messages by running MySQL queries. I wonder how a server will handle this when there are more than 5000 users online. Then it will be 50,000 requests every 5 seconds - what kind of server do you need to handle this? Or maybe even more: with 15,000 users online, 150,000 requests every 5 seconds. My web server has the following specs: quad-core Xeon, 2048 MB RAM, 5000 GB of traffic. Will it be good enough, and for how many users? Can anyone help me out, or point me to where I can find this kind of information or how to make such a calculation?
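
    As a quick sanity check on the raw request rate those numbers imply (figures taken from the question, not measured):

    // Convert the polling pattern into requests per second, which is what capacity planning works with.
    public class PollingLoad {
        public static void main(String[] args) {
            int onlineUsers = 5000;
            int requestsPerInterval = 10;  // worst case from the question: 5-10 AJAX calls
            int intervalSeconds = 5;
            double requestsPerSecond = (double) onlineUsers * requestsPerInterval / intervalSeconds;
            System.out.printf("%,.0f requests/second (%,.0f per minute)%n",
                    requestsPerSecond, requestsPerSecond * 60);
            // 5000 users -> 10,000 req/s; 15,000 users -> 30,000 req/s. Every request also runs
            // MySQL queries, so the database is usually the first tier to saturate.
        }
    }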

    Read the article

  • What is the easiest way to make a backup of an entire hard disk

    - by Solignis
    Hi there, I got myself a Dell laptop from the local computer store. It's a used machine with Windows Vista Home Basic on it. I want to load Ubuntu Desktop 10.10, though, so I can do Perl development. BUT I want to keep a copy of the entire hard drive, with the Dell utility partition and Windows Vista, in case I want to go back. I was thinking I could image the drive, but I'm not sure what to use; I don't have Ghost or anything. Someone told me about Clonezilla. Would that work for me? Is it hard to use? Also, I want to burn the data to a DVD or something more storable than a hard disk.

    Read the article

  • minimum required bandwidth for remote database server

    - by user66734
    I want to build a small warehousing application for my company. We have a central warehouse which distributes to 8 sales points across the country. They insist on an in-house solution. I am thinking of setting up a central MySQL database on a Linux server and having the branches connect to it to store sales. Queries to the database from the branches will be minimal, maybe 10 per hour. However, I need all the branches to be able to store each sale's data (product ID, customer ID) in the central database at peak time at most once every five minutes. My question is: can I get away with simple 24 Mbps/768 kbps DSL lines? If not, what is the bandwidth requirement? Can I rely on a load-balancing router to combine additional lines if needed? Can you propose some server hardware specs?
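
    For the bandwidth part of the question, a rough upper bound is easy to compute (the bytes-per-insert figure below is an assumption, deliberately padded, not a measurement):

    // Rough upper bound on the WAN traffic generated by the sales inserts.
    public class BranchBandwidth {
        public static void main(String[] args) {
            int branches = 8;
            double insertsPerBranchPerSecond = 1.0 / (5 * 60);  // at most one sale every five minutes
            int bytesPerInsert = 1024;  // generous allowance for the INSERT plus protocol overhead
            double bitsPerSecond = branches * insertsPerBranchPerSecond * bytesPerInsert * 8;
            System.out.printf("~%.0f bit/s of sustained upstream traffic%n", bitsPerSecond);
            // Well under 1 kbit/s even with padding, so a 768 kbps uplink is not the constraint;
            // line latency and reliability matter far more than raw bandwidth here.
        }
    }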

    Read the article

  • OOMK kills mysql and apache when there is still a lot[?] of mem

    - by Flyer
    Let me first say that I'm pretty new to *nix systems, and even newer to server management. Anyway, I've got a little problem. I have a VPS with 1 GB of memory running Debian 6. There are a few sites on it, though only one of them generates any real load. Recently, the OOM killer started killing MySQL, which leaves WordPress and phpBB reporting that they can't connect to the MySQL server. That error is bad enough on its own, especially when it happens at night and the site stays down until I wake up and restart MySQL. I probably have a bad line in my crontab that could be the cause of it all (again, I'm new to this):

    */20 * * * * sync; echo 3 > /proc/sys/vm/drop_caches

    Well, if you need any more information, let me know, since I don't really know what would be useful here. I'd also like to know whether it's a bad idea to have that cron task at all.

    Read the article

  • Just installed Ubuntu 12.04. When booting, all I get is a black screen with cursor

    - by user66378
    Installation appears to go fine. After rebooting, I get my motherboard loading screens, but when it comes time for Ubuntu to boot, I just get a black screen with a blinking white underscore in the top-left - same as I got when waiting for the install CD to load, except it lasts forever. The only keypress it seems to recognize is ctrl+alt+del, which reboots. Letters don't register, function keys w/ or w/o modifiers do nothing. I've installed Ubuntu 12.04 twice and got the same error. The first time, I installed it as the only OS, and had it take up the whole disk. The second time, I installed Windows 7 first, then Ubuntu by specifying custom partitions. After this install, it would boot straight to Windows without showing grub. I used EasyBCD to add the Ubuntu installation to grub, and this got grub to show, and let me select it, but it led back to the same error described up top. I've had Linux Mint 11 and 12 installed on this PC, but was unable to get previous versions of Ubuntu to install (always had errors while installing, not after). Hardware: Intel Core i7-2600K Sandy Bridge 3.4GHz (3.8GHz Turbo Boost) LGA 1155 ASUS SABERTOOTH P67 (REV 3.0) LGA 1155 Intel P67 SATA 6Gb/s USB 3.0 ATX Intel Motherboard EVGA 01G-P3-1371-TR GeForce GTX 460 (Fermi) CORSAIR Vengeance 16GB (4 x 4GB) 240-Pin DDR3 SDRAM DDR3 1600 (PC3 12800) Western Digital RE4 WD5003ABYX 500GB 7200 RPM SATA 3.0Gb/s 3.5" Internal Hard Drive

    Read the article

  • Slow self-hosted WordPress website

    - by Integrati Marketing
    Hi all, we have a great site which had been humming along nicely for about 5 months, and then in May it went from a page load time of 3-5 seconds to an agonising 15+ seconds. The host has been really helpful and has even shifted the site to a new, faster server. Since we don't have the insight or expertise ourselves, we thought we'd ask the Server Fault community and see what this crowd of experts would recommend. Appreciate any insight, thank you. The site is here: integrati.com.au Cheers. :)

    Read the article

  • Monitoring/logging a malfunctioning internet connection

    - by Pekka
    I have a mysterious internet connection problem: every 15-20 minutes the connection becomes very slow, and it takes 2-3 minutes for anything to load. I've had a technician from the ISP over here to test the hardware, and everything is in pristine condition. They have no explanation other than a configuration error on my machine, a possibility I can largely exclude because I'm experiencing the same problems with another machine. I will have to monitor the situation now, and I would like to run a program that logs when the internet connection becomes slow. I thought about putting something together using at and wget. Does anybody know of a tool that does this out of the box - maybe something with an adjustable request frequency, logging of connection speeds, etc.?
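
    If nothing off-the-shelf turns up, a small DIY probe along the lines of the at/wget idea could look like this (a sketch only; the target URL and interval are placeholders to replace):

    import java.io.IOException;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.time.LocalDateTime;

    // Fetches a page at a fixed interval and logs how long each request took,
    // so slow periods show up with timestamps in the output.
    public class ConnectionLogger {
        public static void main(String[] args) throws InterruptedException {
            String target = "https://example.com/";  // placeholder test URL
            long intervalMillis = 60_000;             // one probe per minute
            while (true) {
                long start = System.nanoTime();
                String result;
                try {
                    HttpURLConnection conn = (HttpURLConnection) new URL(target).openConnection();
                    conn.setConnectTimeout(10_000);
                    conn.setReadTimeout(30_000);
                    conn.getInputStream().readAllBytes();
                    result = "HTTP " + conn.getResponseCode();
                } catch (IOException e) {
                    result = "FAILED: " + e.getMessage();
                }
                long elapsedMs = (System.nanoTime() - start) / 1_000_000;
                System.out.println(LocalDateTime.now() + "  " + elapsedMs + " ms  " + result);
                Thread.sleep(intervalMillis);
            }
        }
    }

    Redirecting the output to a file (java ConnectionLogger >> connection.log) gives a timestamped record of the slow periods to show the ISP.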

    Read the article

  • What is wrong with this HTML5 <address> element? [closed]

    - by binaryorganic
    <div id="header-container">
      <address>
        <ul>
          <li>lorem ipsum</li>
          <li>(xxx) xxx-xxxx</li>
        </ul>
      </address>
    </div>

    And the CSS looks like this:

    #header-container address {float: right; margin-top: 25px;}

    When I load the page, it looks fine in Chrome & IE, but in Firefox the styling is ignored completely. When I view source in Firefox it looks like the above, but in Firebug it looks like this:

    <div id="header-container">
      <address> </address>
      <ul>
        <li>lorem ipsum</li>
        <li>(xxx) xxx-xxxx</li>
      </ul>
    </div>

    Read the article

  • which mail server is better suited for high volume? [closed]

    - by crashintoty
    I'm planning out a project (a web/mobile app) that will require a mail server that can handle hundreds of thousands of connections per hour (both IMAP/POP and SMTP) and has the ability to interface with PHP (or Python or whatever) to dynamically create, delete and check for mailboxes. This is not for spam purposes; I just need my app to generate random mailboxes (and static/permanent ones too) to receive mail and process it for items listed on my service. The little research I've done so far has turned up Courier, Dovecot, Cyrus and Haraka. I think the ability to scale and/or load balance (I'm new to these terms, pardon me) would also be a requirement. Any ideas?

    Read the article

  • Why does my system slow down or freeze when there is heavy disk activity?

    - by user72270
    I'm a first-time user of Ubuntu 12.04, installed via Wubi. My notebook: Dell Vostro 3450 with an i5-2410M, 3 GB RAM, and Intel HD 3000 / AMD 6630M hybrid graphics. Surfing and playing games work flawlessly; however, I'm having huge problems when installing applications and generally when copying and moving files. When doing so, the system is significantly slower and freezes quite often (Firefox turns bluish, sometimes even black and white). I would say that Ubuntu allocates too many resources to file transfers and installs, but even those tasks are very slow. Here is a very specific example: today I tried to move a 6 GB file from the Windows 7 installation. It was fine at first; I switched to Firefox, but after a while Firefox started randomly turning bluish and the mouse kept cutting out. It got gradually worse and worse, to the point where Firefox went black and white and the mouse wasn't working at all. I gave up and went for a meal, and when I got back the screen was black. It had probably logged me out due to inactivity; when I pressed a random button to bring the screen back to life I had to wait a few minutes, and all it showed was my desktop background. No login screen, just the background and a working mouse pointer. The notebook fan was running at 100%, so I assumed the file transfer was still going on and left it to work. Nothing changed for a full hour, so I hard rebooted. The file transfer was unsuccessful - it had transferred barely 2 GB. Is this normal? What should I do in these situations? It didn't even let me load the system monitor, or even a terminal. Thanks.

    Read the article

  • Top down or bottom up approach?

    - by george_zakharov
    This is the closest I think I can get to defining my problem. I'm an interface designer, and now I'm working with a team of programmers to develop a new CMS for a heavy media site. I'm sorry if this is a very, very dumb question to ask here, but I really need some help. As I started developing a specification list for a prototype, it turned out to be a very big one. I realize now that the client side will be JS-heavy, with lots of drag-and-drop and other "cool designer stuff". The CMS will even include a small project management system for its users - nothing big like Basecamp, but with a live feed of comments, etc. The problem is that the team is now split. Some propose the existing backend solution used in another CMS; others propose rewriting everything from scratch. The argument for keeping the code is that it is faster; the argument for rewriting is to make it a better fit for the proposed design (including Node.js and other stuff I don't really get). The question is: can the UI specs influence the backend? The guys who propose using the existing solution built everything with the Yii framework (as far as I know), and they say that nothing on the server is affected by this "interface coolness". Others say that it is, and that even autosave can't work without putting load on the server. If this is really incomprehensible, again, I'm sorry, and I'll be happy to clarify in response to your questions. Thanks in advance.

    Read the article

  • I have just upgraded to 13.10 and I cannot install any programs

    - by jason malitz
    I upgraded to Ubuntu 13.10 last night. I tried to install the Empathy chat client, and this is what I see after the failed download/install:

    installArchives() failed:
    (Reading database ... 397719 files and directories currently installed.)
    Removing xserver-common-lts-raring ...
    Removing 'diversion of /usr/lib/xorg/protocol.txt to /usr/lib/xorg/protocol-precise.txt by xserver-common-lts-raring'
    dpkg-divert: error: rename involves overwriting `/usr/lib/xorg/protocol.txt' with different file `/usr/lib/xorg/protocol-precise.txt', not allowed
    dpkg: error processing xserver-common-lts-raring (--remove):
     subprocess installed post-removal script returned error exit status 2
    No apport report written because MaxReports is reached already
    Errors were encountered while processing:
     xserver-common-lts-raring
    Error in function:

    So how do I fix this issue?

    Read the article

  • Unable to run Django on Mac OS X

    - by cybervaldez
    I'm working on a Django project on my Mac (running Leopard) and I want to show it to my team. I've already set up the necessary port forwards from my router to my Mac's LAN IP address, but it doesn't work. I've also tried running the XAMPP server, since that always worked on my Windows XP computer, but it still doesn't work. Whenever I type my > it's showing a Page Load Error. Is this possibly a Mac OS X configuration issue that I need to sort out first to allow my port forwards through? It's my first time doing this on a Mac; perhaps I need to configure something else in Network Preferences?

    Read the article

  • Django on Windows 2003 Slow Initially

    - by John
    I have set up Django to run on a Windows 2003 server following the steps on the Django wiki. Everything works fine and there are no errors. Only one instance of Django is set up on the server at the moment. However, whenever the first page is requested it takes about 10 seconds to load; after that, every page loads instantly. All my searches about speed issues with Django on Windows refer to the local development server, not to running under IIS with PyISAPIe. Thanks

    Read the article

  • Is it better to combine Apache for file manipulation and uploads and Nginx for static file serving, or to use one of the two alone

    - by user1032393
    Based on my research, I've read that Nginx is ideal for serving static files and images. My application depends heavily on uploading images, rewriting them, and then serving them up. Given that I currently have only one VPS, it has been suggested that I use Nginx to serve the images and the website, and reverse proxy to Apache (on the same VPS) to rewrite files with ImageMagick and handle the file uploads. Which would be the best solution: Apache, Nginx, or Apache + Nginx? In terms of "best", I'm looking for minimal average RAM consumption while maintaining a decent load speed of maybe sub-2 seconds.

    Read the article
