Search Results

Search found 18978 results on 760 pages for '3rd party library'.


  • WD My Cloud 4TB is super slow

    - by Saduser
    I am using a WD My Cloud 4TB and I have read other posts from users complaining about getting only 10 MB/s. My problem is that I am getting about 100 KB/s transferring a 125 GB iPhoto library; the estimated time is 11 days, which is unacceptable. The light on the back of the WD My Cloud is solid green, which from what I have read means I am on a gigabit network. I have a MacBook Pro running OS X Mavericks. I have tried 4 different cables and turned off my router's firewall, and I don't run antivirus or any firewall on the Mac. Other things I have checked: a direct connection to both the router and the WD My Cloud device. I tried wireless, but it is even slower. Previously I was able to transfer a 55 GB iPhoto library in 14 hours, which I felt was acceptable; I figured it would take roughly double that to transfer the 125 GB library, but 11 days is ridiculous. Any other suggestions? What else can I check (and how), and where is the bottleneck?
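
    One way to separate a raw network problem from an iPhoto-specific one is to confirm the negotiated link speed and then time a single large test file; a minimal sketch, assuming the wired interface is en0 and the share is mounted at /Volumes/Public (both hypothetical):

        # Confirm the Mac negotiated gigabit (look for 1000baseT in the output)
        ifconfig en0 | grep media

        # Create a 1 GB test file and time the copy to the share
        mkfile 1g /tmp/testfile
        time cp /tmp/testfile /Volumes/Public/testfile

    If the timed copy runs near wire speed, the bottleneck is likely the single huge iPhoto library bundle rather than the network.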


  • Where to set Visual Studio 2013 property macros

    - by marcp
    I'm a new VS user. I've received some sample C++ projects that work with a 3rd party API. They were saved in VS2012 format, but I have VS2013. After conversion I find that an API-specific macro is defined in the project properties under "Linker | General | Additional Library Directories". If I click 'Edit' I can replace the macro with an actual path, but how do I establish what the macro points to? In other words, how does one create a macro usable in multiple projects?
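
    For reference, user macros in Visual Studio live in property sheets (.props files) added via the Property Manager; a minimal sketch of such a sheet, where the macro name ApiRoot and its path are made-up placeholders:

        <?xml version="1.0" encoding="utf-8"?>
        <Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
          <PropertyGroup Label="UserMacros">
            <ApiRoot>C:\SDKs\VendorApi</ApiRoot>
          </PropertyGroup>
          <ItemGroup>
            <BuildMacro Include="ApiRoot">
              <Value>$(ApiRoot)</Value>
            </BuildMacro>
          </ItemGroup>
        </Project>

    Any project that imports this sheet can then reference $(ApiRoot) in settings such as Additional Library Directories.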


  • Installing sqlite gem fails on AWS Linux instance with sqlite-devel libraries installed

    - by Scott
    Hi, I'm running an instance built off ami-595a0a1c. I am trying to install the sqlite3 (or sqlite) gem and it's failing with the error below:

        $ sudo gem install sqlite3
        Building native extensions. This could take a while...
        ERROR: Error installing sqlite3:
        ERROR: Failed to build gem native extension.
        /usr/bin/ruby extconf.rb
        checking for sqlite3.h... no
        sqlite3.h is missing. Try 'port install sqlite3 +universal' or
        'yum install sqlite3-devel' and check your shared library search path (the
        location where your sqlite3 shared library is located).
        *** extconf.rb failed ***
        Could not create Makefile due to some reason, probably lack of necessary
        libraries and/or headers. Check the mkmf.log file for more details. You may
        need configuration options.

        Provided configuration options:
          --with-opt-dir
          --without-opt-dir
          --with-opt-include
          --without-opt-include=${opt-dir}/include
          --with-opt-lib
          --without-opt-lib=${opt-dir}/lib
          --with-make-prog
          --without-make-prog
          --srcdir=.
          --curdir
          --ruby=/usr/bin/ruby
          --with-sqlite3-dir
          --without-sqlite3-dir
          --with-sqlite3-include
          --without-sqlite3-include=${sqlite3-dir}/include
          --with-sqlite3-lib
          --without-sqlite3-lib=${sqlite3-dir}/lib

        Gem files will remain installed in /usr/lib64/ruby/gems/1.8/gems/sqlite3-1.3.3 for inspection.
        Results logged to /usr/lib64/ruby/gems/1.8/gems/sqlite3-1.3.3/ext/sqlite3/gem_make.out

    Typically this just means you need to install the development libraries and everything is cool. However, I have installed the sqlite-devel packages and still no dice. Since this is the Amazon Linux instance, I'd rather not add more repositories than the ones Amazon provides if possible. What can I do to get this thing to compile? Thanks for any insight! From a brand new instance, here's what I've done:

        $ sudo yum install rubygems ruby-devel
        $ sudo gem update --system
        $ sudo gem install rails
        $ rails new app
        $ cd app
        $ rails server
        Could not find gem 'sqlite3 (= 0)' in any of the gem sources listed in your Gemfile.
        $ sudo yum install sqlite-devel
        $ sudo gem install sqlite   (or sqlite3 -- same result)

    See breakage above.
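
    One thing worth trying, assuming the headers landed in the stock Amazon Linux locations (/usr/include and /usr/lib64 here are assumptions): point the gem build at them explicitly, since extconf.rb may not search lib64 on its own:

        $ sudo yum install -y sqlite-devel
        $ sudo gem install sqlite3 -- --with-sqlite3-include=/usr/include \
                                      --with-sqlite3-lib=/usr/lib64

    Everything after the bare -- is passed through to extconf.rb rather than to gem itself.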


  • Separating user resources - Windows Server 2008 (Terminal Server)

    - by Christopher Wilson
    At the moment I am running a Windows Server 2008 terminal server for around 10 clients that use it to run programs and access data. Is there any way to separate the resources of each user so that they do not impact each other? For example:

        User 1: opens a program
        User 2: notices a slowdown

    I have looked into using Windows System Resource Manager, but I don't know whether it provides what I need, or whether there are 3rd party tools that also provide this functionality. Any answer is appreciated.

    Server specs:
    HP ProLiant ML110 G7
    Processor: Intel® Xeon® E3-1220 (4 cores, 3.1 GHz, 8 MB, 80 W, 1333 MHz)
    RAM: 12 GB DDR3 ECC
    Storage: 1 TB HDD


  • How to disable passive mode in the Linux ftp command

    - by nute
    I am using the "ftp" command of linux to send data to a 3rd party provider. This company states that we need to "Disable passive mode in your FTP client", and I confirm it doesn't work in passive mode. However, when I googled the linux command, I see that the "-p" flag is "the default now for all clients (ftp and pftp) due to security concerns using the PORT transfer mode. The flag is kept for compatibility only and has no effect anymore." How do I disable passive mode then? And, is it that bad?


  • Multiple screens are not aligning properly

    - by Steve
    I am using four Samsung (SMB2230) screens. When I place my mouse at the bottom right of the first screen and pull it into the second screen, the mouse jumps to the middle of the left edge of the second screen. It does the same on the 3rd and 4th screens. The biggest jump is between the 1st and 2nd; it tightens up between the 2nd and 3rd, and is close between the 3rd and 4th. This is a real pain because my mouse gets hung up and I have to move up and down to get it to change screens. I had four Dell screens before and this didn't happen. I'm using Windows 7.


  • Backup hardware and strategy on distributed Windows Server 2008 network

    - by CesarGon
    This question is a follow-up to this. We have a Windows Server 2008 R2 domain over a network that spans two buildings, linked by a 100-Mbps point-to-point line. Over 60 users work in the organisation. We are planning to use DFS folders and DFS replication for file serving across the organisation. The estimated data volume is over 2 TB and will grow at approximately 20% annually. The idea is to set up a DFS file server in each building and use DFS replication so that all the contents stay replicated over the 100-Mbps link. We are now considering backup hardware and strategies. We are Dell customers and, after browsing the online Dell catalogue, I can see a number of backup hardware options. My main doubts are the following:

    Would you go for a tape library, disk backup, or are there other options worth considering?
    Would you perform batch backups (i.e. nightly) or would you use continuous backup (i.e. while users are working)?
    Would you use a dedicated backup server to which the tape library (or any other backup device) is attached, or is there another way of doing things?

    My experience with backup hardware and overall setup is limited, so I appreciate any good advice you may have. Thanks.


  • Linux: compare binary files

    - by frustratedCmpNoLongerUser
    I need to compare two binary files and get output for every differing byte. So if file1.bin is

        00 90 00 11

    and file2.bin is

        00 91 00 10

    I want to get something like:

        00000001 90 91
        00000003 11 10

    What is the easiest way to accomplish this? A standard tool? Some 3rd party tool? inb4: cmp -l should be killed with fire; it uses decimal for offsets and octal for bytes. "Consistency is for losers" must be cmp's motto.
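
    For what it's worth, cmp -l's odd output can be converted into exactly that hex format with GNU awk (the leading 0 makes strtonum parse the octal byte values); a sketch:

        cmp -l file1.bin file2.bin | \
          gawk '{printf "%08X %02X %02X\n", $1 - 1, strtonum(0$2), strtonum(0$3)}'

    cmp -l prints 1-based decimal offsets, so $1 - 1 shifts them to the 0-based offsets shown above.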


  • How do I find funny pictures?

    - by Hanno Fietz
    No, not lolcats. And I'm not really looking for a specific site, either. I have often wished I had some funny picture to illustrate a presentation, a website, a post, an email, or something else. Google image search and stock photo services have hardly ever helped me, although that may be because I'm doing something wrong. Jeff Atwood seems to have no problem finding funny pictures for his codinghorror and stackoverflow blogs, as well as for the error messages on the trilogy sites. One of my favourites was this elephant. Other bloggers also seem to be quite good at it. I'm wondering if I simply lack the creativity or if there are sources or methods I don't know about. I can think of the following ways to get pictures, but I'm not sure whether this is really "how they do it":

    keep a collection of pictures that you stumbled upon and liked (takes quite some time to build up to a proper library), so when you need a picture, there's one in there
    maybe have pictures on paper, too, like from magazines or ads
    when you are looking for a picture, search online (Flickr, Google, stock photos) -- this has never really worked for me, I don't know why
    produce the pictures yourself, i.e. have a good library of source material or find some online and apply some creativity and suitable software; I could imagine that this could work well once you have the practice


  • LDAP Authentication for multiple AD Domains

    - by TrevJen
    I have 3 full-trust domains (2 child and one root). I need to use LDAP to authenticate domain users. The trick is that I need the application to use an AD server for the child domain but proxy the LDAP query and authentication for the root domain. I see that it may be possible with AD LDS and some trusts and syncing, but it looks pretty hairy and overly complicated. The short of it is:

    3 domains (Parent, ChildA, ChildB)
    My 3rd party app will need to use ChildA domain servers to authenticate either:
    a. a user in the parent domain, or
    b. a user in the ChildB domain

    I already have full trusts between all domains, and regular NTLM authentication works fine (unless you are trying to authenticate with LDAP).
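
    One avenue worth checking: a ChildA domain controller that is also a Global Catalog answers LDAP on port 3268 with a partial replica of every domain in the forest, so a single bind point can find users from Parent, ChildA and ChildB. A sketch with hypothetical names:

        # Search the forest-wide Global Catalog on a ChildA DC (port 3268)
        ldapsearch -H ldap://dc1.childa.example.com:3268 \
          -D "[email protected]" -W \
          -b "dc=example,dc=com" "(sAMAccountName=jsmith)"

    Whether your 3rd party app can be pointed at port 3268 instead of 389 depends on the app.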


  • Windows 7 backup keeps trying to back up non-existent files and folders

    - by Ayusman
    My Windows 7 system backup keeps trying to back up 2 non-existent files and one folder. I have double-checked that these files do not exist. How does Windows 7 try to back up something that does not exist, and then complain and fail the backup? Here are the messages:

        Backup encountered a problem while backing up file D:\Non library songs\Klub Arabia. Error: (The system cannot find the file specified. (0x80070002))
        Backup encountered a problem while backing up file D:\Non library songs\Klub Arabia 2. Error: (The system cannot find the file specified. (0x80070002))
        Backup encountered a problem while backing up file D:\Data\FRIENDS. Error: (The system cannot find the file specified. (0x80070002))

    These files/folders may have been there at some point but have since been moved. Any idea how to solve this? Does this mean all other content has been backed up successfully? I have Windows 7 Professional 64-bit and am backing up my Win7 machine to an external hard drive. Ayusman


  • Sending spam-free mail through my website

    - by Sara
    Hi, I've been battling with this issue for a couple of months. I need to send bulk mail (not spam) through my social network to users in situations like newsletters and site invitations (when a user imports their address book contacts). I'm using shared hosting and it limits me to 500 mails per hour. Even though I manage to send mails, most of them end up in users' spam boxes. After researching, these are the solutions I finally came up with:

    1) Use Google Apps SMTP (http://www.google.com/apps/intl/en/business/features.html)
    2) Move to a VPS
    3) Use shared hosting with throttling enabled

    Please advise me on what to choose. Will using Google Apps prevent mail being marked as spam? I can't use other 3rd party SMTP services like iContact or Aweber, as the "invitation sending script" will send emails to thousands of contacts, depending on the user's address book. Thanks in advance
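
    Whichever relay ends up being used, deliverability also depends on DNS: publishing an SPF record that authorises the relay to send for the domain. A minimal sketch, assuming Google Apps SMTP and a hypothetical domain of example.com:

        example.com.  IN  TXT  "v=spf1 include:_spf.google.com ~all"

    Receivers that check SPF are far less likely to junk mail whose sending server the record authorises.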


  • SharePoint 2010 Licensing Costs

    - by akil.franklin
    We will be implementing a public-facing website in SharePoint 2010 and I have a few questions regarding licensing:

    Is there any (relatively) reliable pricing information available for SharePoint 2010? What about rumors?
    What edition of SharePoint 2010 would be appropriate for a publicly facing website (in 2007 you needed Enterprise for this, but it seems that WCM functionality is included in Standard in 2010)?
    What would be a reasonable number to budget for SharePoint 2010 licensing for a publicly facing website?

    Note: I have tried asking Microsoft directly. Unless you are a volume licensing customer, they direct you to a reseller (like CDW). Unfortunately, none of the resellers have the pricing for 2010 yet; the SKU isn't even in their system. I was able to get in touch with the Microsoft pre-sales team and they confirmed that the price list for 2010 will be published on May 3rd (or thereabouts), but they weren't able to give me a price. Thanks in advance for your help!


  • SharePoint/WSS Reporting Services integration woes

    - by mhollers
    After a number of failed attempts I seem to have successfully installed the Reporting Services add-in to my WSS farm. However, I seem to be missing most of the enhanced functionality, e.g. no Report Library template and no Report Center site template; the only additional functionality available is the Report Viewer web part.

    Background: a 2-server WSS 3.0 farm with Central Admin (CA), the web front end (WFE) and the Reporting Services add-in installed on one, and SQL Server 2005 SP2 with Reporting Services (RS) and all databases installed on the other. I have a VM environment set up and have rolled this back and repeated it a number of times. I have configured RS within CA and activated the 'Report Server Integration Feature'. Within 'site settings' I have a 'Reporting Services' heading with a 'manage shared schedules' item/link; I'm not sure if there should be other options. I was under the impression that to view reports within SharePoint I could either create a new site using the 'Report Center' template or add a Report Library to an existing site, neither of which seems available. I am at a loss as to what to do, as all the online information seems to deal with installation issues/errors, which I seem to have eventually got past.


  • Macro to manage sport ranking and calendar?

    - by Ale
    I need to write a macro to manage the ranking and calendar for a curling tournament. The event will follow the Schenkel system:

    the first match is determined by a general draw
    after every team has played one match, it is possible to determine the first ranking
    the second match is determined by the rule: 1st vs. 2nd, 3rd vs. 4th, 5th vs. 6th, and so on
    after every team has played two matches, it is possible to determine the second ranking
    and so on until the end (3 to 5 matches normally)

    Another rule is that from the second match onward it is not possible to play against a team I have played before. I was thinking of using MS Excel, but Calc (LibreOffice/OpenOffice) would also be fine. Thanks in advance


  • Loading dependencies for custom Puppet functions

    - by Ben Smith
    I have written a custom Puppet function, which is working fine, that depends on the cloudservers gem (a Rackspace client library). This is fine if I have pre-installed the gem on a server before running Puppet, but it totally breaks if I have not installed the gem, as the function seems to be run during the 'compilation' sweep, well before my package definition is realised. Here's what my .pp looks like, with get_hosts the function that requires the cloudservers gem:

        package { "rubygems":
          ensure   => installed,
          provider => "gem";
        }

        package { "cloudservers":
          ensure   => installed,
          provider => "gem",
          require  => Package["rubygems"];
        }

        class hosts::us {
          $hosts = get_hosts("us")
          hostentry { $hosts: }
        }

        define hostentry() {
          $parts   = split($name, ',')
          $address = $parts[0]
          $ip      = $parts[1]
          $aliases = $parts[2]
          host { $address:
            ip           => $ip,
            host_aliases => $aliases,
          }
        }

    Is there a way to stop the function being run so early, or at least to have its run depend on the library being installed? Alternatively, is there a way I can add dependencies somewhere in the functions folder that will be available to the function?
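
    One pattern that can help, sketched below under the assumption that the function lives in the usual lib/puppet/parser/functions/ tree: move the require into the function body, so the gem is only loaded when the function is actually called rather than when the file is parsed:

        # lib/puppet/parser/functions/get_hosts.rb -- a sketch, not the original
        module Puppet::Parser::Functions
          newfunction(:get_hosts, :type => :rvalue) do |args|
            require 'cloudservers'  # deferred: only fails if the gem is missing at call time
            region = args[0]
            # ... query Rackspace here and build "address,ip,aliases" strings ...
            []                      # placeholder return value
          end
        end

    This doesn't solve ordering within a single run (compilation still happens before packages are installed), but it stops the mere presence of the function from breaking agents that never call it.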


  • Internet Explorer 6 Encountered An Error ntl.dll Issue on XP and Win2000

    - by Gary B2312321321
    As we have some PCs still running Windows 2000, and this is beyond our control, we use IE6 so we can keep the IE platform standard. Recently, on some XP and Win2000 PCs, IE has been crashing with the "Encountered a problem and needs to close. Do you want to send a message to Microsoft?" error when entering some sites. This happens even after a format and reinstall with a minimal software load (Office 2000, some Avaya software, Kaspersky, tapiex.dll). The module pointed to in the error report is ntl.dll. There are lots of reports of the problem around the net, but has anyone resolved this issue? (The latest IE6 updates are installed.) Also please note there were no 3rd party IE add-ons, spyware, viruses or adware. Hope someone can help.


  • Set Display Refresh Rate in OSX w/o External Utilities

    - by codedonut
    I have an iMac and an LG Flatron connected as a secondary monitor. The recommended resolution for the Flatron is 1680x1050 @ 65.290 kHz (horizontal), 59.954 Hz (vertical). For some reason, OS X chooses a slightly different set of scan rates, and this is currently my best guess as to why the monitor goes into power-saving mode when connected to the iMac (but works fine on a PC). I resolved this by installing SwitchResX and fudging the scan rates according to the specs in the manual. But how does one change these rates without 3rd party tools? Which config files need editing? Thanks


  • PHP enable sqlite phpinfo states --without-sqlite

    - by Jahmic
    I've seen similar questions, but none that address my situation adequately. I'm running Apache and PHP 5.3.6 on an Amazon cloud server. phpinfo() keeps stating that sqlite is disabled, at least judging from the configure line:

        './configure' ... '--without-sqlite'

    In other parts of the phpinfo() output:

        PDO
        PDO drivers                  mysql, sqlite
        PDO Driver for SQLite 3.x    enabled
        SQLite Library               3.6.20

        sqlite3
        SQLite3 support              enabled
        SQLite3 module version       0.7-dev
        SQLite Library               3.6.20

        Directive                    Local Value    Master Value
        sqlite3.extension_dir        no value       no value

    And at least one of the following PHP checks fails:

        if (!extension_loaded('SQLite') OR !function_exists('sqlite_open'))

    yum install states that both sqlite and pdo-sqlite are already installed. I've tried to enable sqlite by editing my local php.ini and adding:

        ; Enable sqlite3 extension module
        extension=sqlite3.so

    I've checked the main php.ini (/etc/php.ini) and there is nothing specific about disabling it. In fact, there is a sub-config file loaded in php.d that also specifies this extension, as well as another for pdo-sqlite. I'm running out of things to look for or try. Any suggestions? How do I find where the PHP configure line is set? Thanks
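
    A note on what that test actually tests, with a quick sketch to run on the server: 'SQLite' / sqlite_open() belong to the legacy SQLite v2 extension (the one --without-sqlite removes), which is separate from the sqlite3 and pdo_sqlite extensions that phpinfo() shows as enabled:

        <?php
        // Which SQLite extensions does this build really have?
        var_dump(extension_loaded('sqlite'));      // legacy v2 -- probably false here
        var_dump(extension_loaded('sqlite3'));     // probably true
        var_dump(extension_loaded('pdo_sqlite'));  // probably true

    If the code being run genuinely needs sqlite_open(), PHP would have to be rebuilt with the legacy extension; otherwise porting the calls to SQLite3 or PDO is likely the smaller job.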


  • OSX 10.6 goes unresponsive

    - by mjb
    This behavior continues to perplex me. My MBP, running 10.6.7, stops responding to all Apple-based software. Whatever software I have open remains open (Terminal, iTunes, Safari), but if I try to use the F-key shortcuts or launch any OS X application not already open (System Preferences, for example), it just bounces in the Dock and never launches. I also cannot reboot without hard rebooting. I left Terminal open, so I see the following in /var/log/system.log:

        Jun 25 19:39:02 mjb-2 com.apple.ReportCrash.Root[59432]: 2011-06-25 19:39:02.585 ReportCrash[59432:7f1f] Saved crash report for CoreServicesUIAgent[59576] version ??? (???) to /Library/Logs/DiagnosticReports/CoreServicesUIAgent_2011-06-25-193902_localhost.crash
        Jun 25 19:39:02 mjb-2 com.apple.ReportCrash.Root[59432]: 2011-06-25 19:39:02.586 ReportCrash[59432:b10f] Saved crash report for quicklookd[59571] version ??? (???) to /Library/Logs/DiagnosticReports/quicklookd_2011-06-25-193902_localhost.crash

    Two requests: (1) please don't send this off to the Apple area so it can die a slow painful rotting death of tumbleweed. (2) Suggest what I should kill -9 or which logs to look at to cut this sh*$ out. Cheers, mjb
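
    Since Terminal stays alive, a couple of safe commands to run the next time it wedges, to see what is actually crashing in real time (paths are the standard 10.6 ones):

        # Watch crash reports arrive as they happen
        tail -f /var/log/system.log

        # List the most recent crash logs; open the newest for a backtrace
        ls -lt /Library/Logs/DiagnosticReports | head

    The processes named in the reports (CoreServicesUIAgent and quicklookd here) are a better kill/relaunch target than guessing.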


  • How to rotate one Squid user among multiple IPs based on the number of requests processed by each IP

    - by Arvind
    I want to set up a Squid ACL in the following manner: say my Squid proxy server has 10 IP addresses, and I have a user 'demouser'. I want the very first request from 'demouser' to use IP address #1, the second request to use IP address #2, the third request of the day to use IP address #3, and so on until all IPs are used up. One more level of control I would like is that once the user has used every available IP address once, the proxy refuses further requests. How do I set up such a configuration in the Squid ACLs? Even a document or how-to would be very helpful. The official wiki talks about one 'weird' case: choosing an IP address based on the time of day the request was made to the proxy server. The other cases are all regular use cases, which are not even remotely near my requirement as specified above.
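
    For reference, stock Squid has no per-user request counter, but Squid 3.2+ can approximate rotation with the random ACL type plus tcp_outgoing_address; a sketch with three made-up addresses:

        acl demouser proxy_auth demouser
        acl third random 1/3
        acl half  random 1/2
        # Evaluated top-down: 1/3 of requests, then 1/2 of the rest, then the remainder
        tcp_outgoing_address 192.0.2.1 demouser third
        tcp_outgoing_address 192.0.2.2 demouser half
        tcp_outgoing_address 192.0.2.3 demouser

    This spreads requests evenly but cannot enforce the strict first-request/IP-#1 ordering or the daily cutoff; that would need an external ACL helper that keeps its own counters.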


  • Is there a mousegestures add-on for Google Chrome?

    - by kgrad
    With the release of Chrome 3.0, I am again considering switching to Chrome as my default browser. The only thing stopping me, and it has been stopping me since Chrome 1 came out, is the lack of the mouse gestures add-on that I have in Firefox. Mouse gestures have become so routine for me that I simply can't use a browser that doesn't have them. There are ways to sort of emulate mouse gestures using 3rd party programs like gMote, but they are not the same and not quite as good. I know that the Chrome developer channel has add-ons, but I haven't been able to find a mouse gestures one, and I'm fairly confident that many people want one. So, does a mouse gestures add-on exist for Chrome? Bonus points if there is a Firebug/Xmarks add-on as well! Thanks.


  • MacBook Pro screen goes dark

    - by Mike M
    I've had my MacBook Pro for two years now with no problems (it has had 3rd party RAM from the get-go). Today I was copying a particularly large VM from an external disk drive to the local MacBook disk. It had about 3 GB to go when I stepped away to do some other things, and when I came back the screen was dark. The computer was still on, but I couldn't see anything. I forced a reboot by holding down the power button; it starts up with the chimes, but there is still no screen. I've done this several times. Any ideas? Do you think the hard disk activity caused it to get too hot?


  • Apache + PHP via FastCGI

    - by Wilco
    I'm running into some problems while attempting to run PHP via FastCGI in Apache. I have the FastCGI module loaded, but get the following error when attempting to load a page:

        The requested URL /fastcgi/php54.fcgi/index.php was not found on this server.

    Somewhere, it seems, the script to be executed is appended to the executable without any spaces. Is this where the problem likely is? Below I've included snippets from my Apache configuration (hopefully this is enough):

        LoadModule fastcgi_module libexec/apache2/mod_fastcgi.so
        FastCgiIpcDir /var/run/fastcgi
        AddHandler fastcgi-script .fcgi
        FastCgiConfig -autoUpdate -singleThreshold 100 -killInterval 300
        AddType application/x-httpd-php .php
        ScriptAlias /fastcgi/ /Library/WebServer/FCGI-Executables/

        <Directory "/Library/WebServer/FCGI-Executables">
            Options +ExecCGI
            SetHandler fastcgi-script
            Order allow,deny
            Allow from all
        </Directory>

        <VirtualHost *:80>
            ServerName www.somedomain.com
            ServerAdmin [email protected]
            DocumentRoot "/Web/www.somedomain.com"
            DirectoryIndex index.html index.php default.html
            CustomLog /var/log/apache2/access_log combinedvhost
            ErrorLog /var/log/apache2/error_log
            Action application/x-httpd-php /fastcgi/php54.fcgi

            <IfModule mod_ssl.c>
                SSLEngine Off
                SSLCipherSuite "ALL:!aNULL:!ADH:!eNULL:!LOW:!EXP:RC4+RSA:+HIGH:+MEDIUM"
                SSLProtocol -ALL +SSLv3 +TLSv1
                SSLProxyEngine On
                SSLProxyProtocol -ALL +SSLv3 +TLSv1
            </IfModule>

            <Directory "/Web/www.somedomain.com">
                Options All -Indexes +ExecCGI +Includes +MultiViews
                AllowOverride All
                <IfModule mod_dav.c>
                    DAV Off
                </IfModule>
                <IfDefine !WEBSERVICE_ON>
                    Deny from all
                    ErrorDocument 403 /customerror/websitesoff403.html
                </IfDefine>
            </Directory>
        </VirtualHost>

    ... and this is the executable:

        #!/bin/sh
        PHP_FCGI_CHILDREN=1
        PHP_FCGI_MAX_REQUESTS=5000
        export PHP_FCGI_CHILDREN
        export PHP_FCGI_MAX_REQUESTS
        exec /opt/local/bin/php-cgi54
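
    For what it's worth, that URL shape (the script path riding after php54.fcgi) is normal for Action handlers: /index.php is meant to arrive as PATH_INFO. One hedged guess at the missing piece is that path info isn't being accepted for the wrapper; AcceptPathInfo is a standard Apache core directive:

        <Directory "/Library/WebServer/FCGI-Executables">
            AcceptPathInfo On
        </Directory>

    If that isn't it, checking error_log for the exact 404 reason usually shows whether Apache rejected the path info or never mapped the alias.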


  • Reduce Windows DNS Server caching on Windows

    - by Nick G
    I'm struggling with DNS caching issues on a Windows-based LAN. I've noticed that if I change a DNS record on a domain hosted by a 3rd party nameserver, I always seem to be the very last person to see the change happen. I can often query the domain using a service that checks propagation around the world, like www.whatsmydns.net, but I usually find that all other DNS servers are correct and it's only my own server that has the old IP, even 8-12 hours later. This is an issue for us as we're website developers and often making changes to DNS records, so these huge delays are frustrating. It seems to be because our primary domain controller (+ Active Directory and DNS) on our LAN (which is also our local DNS server) caches records for ages, way beyond their published TTL. How can I stop the Windows DNS server from caching, or reduce the caching to only an hour or so?
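
    If it does turn out to be server-side caching, the cap is adjustable; a sketch using dnscmd on the DC (3600 seconds = one hour), followed by a flush of what is already cached:

        dnscmd /Config /MaxCacheTtl 3600
        dnscmd /clearcache

    Note that a record outliving its published TTL by 8-12 hours is unusual even at the default MaxCacheTtl of 1 day, so it is also worth confirming which upstream servers the DC forwards to.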

