Search Results

Search found 2839 results on 114 pages for 'amazon'.


  • 4 Best & Free Amazon Affiliate Plugins For WordPress

    - by RogerB
    Are you gearing up to earn from your blog? Are you writing product reviews & would like to earn income by doing just that? Amazon Associates allows you to do this. There are various plugins available out there, both free & premium, to integrate product links related to your blog into your blog pages [...]

    Read the article

  • Switching from Amazon EC2 instance-store to EBS Volume

    - by Adam
    Hi, I have an Amazon EC2 instance that is using an instance store as its root device. It has no EBS volumes attached to it. It has a database and a running web application on it. If I understand correctly this is a bad setup, as I would lose all the data on the instance if it were terminated or failed. I would like to correct this mistake. I'd like to move all the data on the running instance to a new EBS volume and make that new volume the root device. How do I go about doing this? Thanks!

    Read the article

  • Check If You Are Eligible To Try Amazon Fire TV Free for 30 days

    - by Gopinath
    Re/Code first broke the story of Amazon offering its customers the chance to try Amazon Fire TV free for 30 days. The Amazon Fire TV costs $99, but with this offer you get a chance to try it for 30 days without paying any money. After 30 days, if you are interested you can keep it by paying $99; otherwise you can return it to Amazon. All the costs of shipping it to your home as well as returning it to Amazon are covered by the offer, which means it's a real FREE offer to try! This offer is not available to everyone, but selected customers are receiving emails from Amazon. If you are interested in trying Amazon Fire TV for free, you may either wait to receive an offer email or click on this link to check if you are eligible. I verified it today and it says that I'm not eligible for this offer. Try your luck and all the best.

    Read the article

  • Amazon S3 Tips: Quickly Add/Modify HTTP Headers To All Files Recursively

    - by Gopinath
    Amazon S3 is a dirt-cheap cloud storage service that offers unlimited storage on a pay-as-you-use model. Recently we moved all the images and other static files (scripts & CSS) of Tech Dreams to Amazon S3 to reduce load on our VPS server. Amazon S3 is cheap, but the monthly bill will shoot up if the blog's images/static files are not cached properly (more details). By adding the caching HTTP headers Cache-Control or Expires to all the files hosted on Amazon S3, we reduced the monthly bill and also the load time of blog pages. Let's see how to add custom headers to files stored on the Amazon S3 service. Updating HTTP headers of one file at a time: the web interface of the Amazon S3 Management Console allows adding custom HTTP headers to one file at a time through the "Properties" window (to access properties, right-click on a file and select the Properties menu). So if you have to add headers to hundreds of files, this is not the way to go! Updating HTTP headers of multiple files of a folder recursively: to update HTTP headers of multiple files in a folder recursively, we can use the CloudBerry Explorer freeware or the Bucket Explorer trialware application. CloudBerry is my favourite as it's freeware and it has an excellent interface for accessing Amazon S3 from the desktop. Adding HTTP headers with the CloudBerry application is straightforward – right-click on the required folders and choose the option "Set HTTP Headers". Download CloudBerry Explorer
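    For reference, the same bulk update can also be scripted; below is a rough boto3 sketch (not from the article; the bucket name, prefix and max-age value are placeholders) that rewrites each object's Cache-Control header by copying the object onto itself:

        import boto3

        s3 = boto3.client("s3")
        bucket, prefix = "my-blog-assets", "images/"   # placeholders

        paginator = s3.get_paginator("list_objects_v2")
        for page in paginator.paginate(Bucket=bucket, Prefix=prefix):
            for obj in page.get("Contents", []):
                key = obj["Key"]
                head = s3.head_object(Bucket=bucket, Key=key)
                # Copy the object onto itself with MetadataDirective=REPLACE so the
                # new Cache-Control header is stored alongside the existing metadata.
                s3.copy_object(
                    Bucket=bucket,
                    Key=key,
                    CopySource={"Bucket": bucket, "Key": key},
                    MetadataDirective="REPLACE",
                    CacheControl="public, max-age=31536000",
                    ContentType=head["ContentType"],
                    Metadata=head.get("Metadata", {}),
                )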

    Read the article

  • Including Amazon affiliate link to Amazon homepage

    - by zpesk
    I want a link on my website that sends the user to the Amazon homepage, but includes my affiliate code so that any subsequent purchases get credited to my account. I have been browsing through the Amazon Associates portal and there only seems to be a way to link to a specific product. I found this link on daringfireball.net: http://www.amazon.com/exec/obidos/redirect?tag=daringfirebal-20&path=subst/home/home.html If I replace his affiliate code (daringfirebal-20) with mine, will this work? Is this documented anywhere?

    Read the article

  • Failure rate of EBS snapshots and AMIs?

    - by user784637
    According to Amazon, the failure rate of EBS volumes is: "As an example, volumes that operate with 20 GB or less of modified data since their most recent Amazon EBS snapshot can expect an annual failure rate (AFR) of between 0.1% – 0.5%, where failure refers to a complete loss of the volume" http://aws.amazon.com/ebs/ However, I was curious to know the failure rate of EBS snapshots and private AMIs that a system admin would take. Since EBS snapshots and AMIs are stored in Amazon S3, is it safe to assume that the likelihood you cannot roll back to a previous snapshot or AMI is the same as the likelihood that a file gets lost in S3? Amazon S3's standard storage is: "... Designed to provide 99.999999999% durability and 99.99% availability of objects over a given year." http://aws.amazon.com/s3/
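    As a back-of-the-envelope comparison (my own arithmetic on the two quoted rates, treating each snapshot as a single S3 object, which is a simplification):

        # Quoted figures: EBS AFR of 0.1%-0.5% per volume; S3 standard storage is
        # designed for 99.999999999% annual durability per object.
        ebs_afr_upper = 0.005
        s3_annual_loss = 1 - 0.99999999999           # ~1e-11 per object per year

        snapshots = 100                              # a fleet's worth of snapshots/AMIs
        p_lose_any = 1 - (1 - s3_annual_loss) ** snapshots

        print(f"Upper-bound chance of losing a given EBS volume in a year: {ebs_afr_upper:.1%}")
        print(f"Chance of losing any of {snapshots} S3-backed objects in a year: {p_lose_any:.1e}")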

    Read the article

  • How do I host multiple SSL websites on a single EC2 instance using Amazon Elastic Load Balancers?

    - by Developr
    If I have 3 separate websites which all require SSL (separate certificates) that I want to host on the same EC2 instance(s) across multiple availability zones so that we have the ability to scale and be highly available, how do I achieve this using ELBs in my Amazon VPC? Each site requires a separate IP address, so I have added multiple private IPs to the EC2 instance, but I am unsure how to bind the ELB to a certain IP on the instance. I was also able to set up multiple ELBs pointing to the same instance, but again, I am not seeing any way to bind each ELB to a separate IP on the instance. If this is not possible, what is the best option? Run each site on a separate EC2 instance / ELB combo (expensive and harder to maintain); give each site a separate public IP and use Route 53 to do the load balancing (seems like a hack); or use a different load balancer option such as HAProxy that should be able to work like a normal load balancer appliance. Please help!

    Read the article

  • Reducing latency for different geographic regions on Amazon Cloud

    - by Shoaibi
    I have an application which has three components: application code (an Amazon EC2 US-EAST-1 instance), application images and other static data (Amazon S3 with CloudFront), and the application database (Amazon RDS). In short, I need something like CloudFront for EC2. In detail: people using this application from a different region, say the Middle East, will get faster static content downloads thanks to CloudFront, but there would be a lot of latency when communicating with the EC2 instance. I want a budget-friendly way of improving this. Launching Amazon instances in every region on offer is certainly a choice, but it isn't really cheap, so I would try to avoid it unless it's a last resort. Also, if my clients need to communicate with the RDS database directly, is there some kind of solution that provides the kind of functionality mentioned above, but for RDS?

    Read the article

  • How do I import Amazon MP3s with Banshee and the new Amazon Cloud Player?

    - by adempewolff
    Banshee's Amazon MP3 Import extension until recently allowed seamless importing of songs purchased from Amazon MP3. It did this by a) opening .amz files and using them to connect to and download the purchased files from Amazon's servers, and b) using hooks in Banshee's built-in browser to automatically recognize and open the .amz files when clicked on in the browser. However, recently this functionality stopped working. Banshee will display "Contacting Server" in the lower left-hand corner for a little while and then stop. Furthermore, opening the Amazon Cloud Player in the Banshee browser, or any other browser on a Linux system, to manually download the .amz file now results in the message: "On Linux systems, Cloud Player only supports downloading songs one at a time. To download your music, deselect all checkboxes, select the checkbox for the song you want to download, then click the 'Download' button." How can I get around this and import my purchased music into Banshee as I used to?

    Read the article

  • How to configure Amazon Route 53 to work without the www subdomain

    - by romuloigor
    Edit: Amazon now supports this. http://aws.typepad.com/aws/2012/12/root-domain-website-hosting-for-amazon-s3.html I have my domain configured in Route 53 at Amazon AWS. Executing the ping command on my domain without www ("$ ping gabster.com.br") gives "ping: cannot resolve gabster.com.br: Unknown host". Executing the ping command on my domain with www ("$ ping www.gabster.com.br") gives "PING s3-website-sa-east-1.amazonaws.com (177.72.245.6): 56 data bytes; 64 bytes from 177.72.245.6: icmp_seq=0 ttl=244 time=25.027 ms; 64 bytes from 177.72.245.6: icmp_seq=1 ttl=244 time=25.238 ms; 64 bytes from 177.72.245.6: icmp_seq=2 ttl=244 time=25.024 ms". In Route 53, creating a record set with Name: [ ].gabster.com.br and CNAME value www.gabster.com.br displays the error "RRSet of type CNAME with DNS name mydomin.com is not permitted at apex in zone mydomin.com".
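    Since the edit above notes that Amazon now supports root-domain hosting on S3, the usual fix is an alias A record at the zone apex instead of a CNAME (the S3 bucket has to be named after the domain). A rough boto3 sketch, with both hosted zone IDs left as placeholders (the alias one is the fixed ID AWS publishes for the S3 website endpoint in each region):

        import boto3

        route53 = boto3.client("route53")
        route53.change_resource_record_sets(
            HostedZoneId="Z_YOUR_ZONE_ID",   # placeholder: the gabster.com.br hosted zone
            ChangeBatch={
                "Changes": [{
                    "Action": "UPSERT",
                    "ResourceRecordSet": {
                        "Name": "gabster.com.br.",
                        "Type": "A",
                        "AliasTarget": {
                            # Placeholder: the fixed hosted zone ID AWS publishes for
                            # the S3 website endpoint in sa-east-1.
                            "HostedZoneId": "Z_S3_WEBSITE_ENDPOINT_ZONE",
                            "DNSName": "s3-website-sa-east-1.amazonaws.com.",
                            "EvaluateTargetHealth": False,
                        },
                    },
                }]
            },
        )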

    Read the article

  • How to install Ceph on an EC2 Amazon Linux AMI

    - by takaomag
    I want to test Ceph (a distributed network storage and file system) on some EC2 hosts derived from the Amazon Linux AMI (amzn-ami-2011.09.2.x86_64-ebs). The kernel version is 3.2 and btrfs is enabled, but the kernel config options related to Ceph (CONFIG_CEPH_FS and CONFIG_BLK_DEV_RBD) seem to be disabled. Do I have to build a new kernel and register it with Amazon? Or does someone know an easier way?

    Read the article

  • Automatically Log In & Start a Windows Program on Amazon's EC2 Service

    - by darkAsPitch
    How can I automatically start a program on Amazon's EC2 Windows 2008 web servers? For example, if I wanted to test the "Digg effect" on a web page of mine, how could I launch 100 Windows 2008 servers at once, each loading one (or two?) instances of the Firefox web browser? I have placed a sample batch file in the Windows startup folder that echoes the time it was called, but it is only run when I actually log in remotely via the Remote Desktop Protocol. I don't want to have to log in to 100 servers to get my software to run :P What can I do? I am using this Windows 2008 Datacenter, Amazon-supplied AMI specifically: ami-a2698bcb

    Read the article

  • I/O intensive MySQL server on Amazon AWS

    - by rhossi
    We recently moved from a traditional data center to cloud computing on AWS. We are developing a product in partnership with another company, and we need to create a database server for the product we'll release. I have been using Amazon Web Services for the past 3 years, but this is the first time I have received a spec with this very specific hardware configuration. I know there are trade-offs and that real hardware will always be faster than virtual machines, and knowing that fact beforehand, what would you recommend? 1) Amazon EC2? 2) Amazon RDS? 3) Something else? 4) Forget it baby, stick to the real hardware. Here are the hardware requirements. Server 1 will be focused on I/O and MySQL for the statistics, with memory size and disk space for the image hosting. I/O: the main workload on this server will be I/O processing; FusionIO cards have proven themselves extremely efficient, and this is currently the best you can have in this domain: Fusion ioDrive2 MLC 365GB (http://www.fusionio.com/load/-media-/1m66wu/docsLibrary/FIO_ioDrive2_Datasheet.pdf). CPU: MySQL will use fewer CPU cores than Apache but it will use them very hard; the E7 family has 30M L3 cache, which provides a performance boost, so 1x Intel E7-2870 will be OK. Storage: SAS will be good enough in terms of performance, especially considering the space required: RAID 10 of 4 x SAS 10k or 15k for a total available space of 512 GB. Memory: 64 GB minimum is required on this server considering the size of the statistics database. Warning: the statistics database will grow quickly; if possible consider starting with 128 GB directly, it will help. Server 2 has the same requirements: also focused on I/O and MySQL for the statistics, with memory and disk space for the image hosting, the same Fusion ioDrive2 MLC 365GB card, 1x Intel E7-2870, RAID 10 of 4 x SAS 10k or 15k (512 GB total), and a minimum of 64 GB of memory (128 GB if possible). Thanks in advance. Best,

    Read the article

  • Amazon EC2 migration from one region to the other

    - by Gnanam
    I'm using the following Amazon EC2 resources in the US East (Virginia) region: 1 running instance, 1 Elastic IP, 2 EBS volumes, 100 EBS snapshots, 1 key pair, 2 security groups, and 5 of my own AMIs (customized based on my application stack). My instance is based on a Linux distribution (CentOS) and my AMIs are S3-backed. Both EBS volumes are mounted on this running instance. We're planning to migrate our deployment to the US-West region. Because Amazon EC2 resources are not shared across regions, my questions are: What are all the factors that I need to consider in advance? What are the recommended ways of migrating each EC2 resource from one region to the other? Are there any hidden risks involved during and/or after the migration? Experts' ideas/suggestions/recommendations on this are highly appreciated.
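    For the EBS snapshots specifically, AWS later added a cross-region snapshot copy API; here is a minimal boto3 sketch (the snapshot ID and regions are placeholders, and it does not cover S3-backed AMIs, security groups or Elastic IPs):

        import boto3

        # Copy an EBS snapshot from us-east-1 into us-west-1; the call is made
        # against the destination region's client.
        ec2_west = boto3.client("ec2", region_name="us-west-1")
        copy = ec2_west.copy_snapshot(
            SourceRegion="us-east-1",
            SourceSnapshotId="snap-0123456789abcdef0",   # placeholder
            Description="Copied from us-east-1 for region migration",
        )
        print("New snapshot in us-west-1:", copy["SnapshotId"])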

    Read the article

  • Amazon EC2 root defaults on EBS

    - by CodeShining
    I'm trying to understand why, when launching a new instance, Amazon defaults to an EBS root volume (8 GB) instead of instance storage. Why do they offer instance storage, then, if it's not also used to boot the base system? Is it safe to uncheck "delete on termination", make the volume bigger (~50 GiB) and keep all files on that EBS volume instead of creating a new one, to make sure data will persist and will also be usable by another instance?

    Read the article

  • Amazon DynamoDB or MySQL for storing large arrays inside each row

    - by Logan Besecker
    I am trying to decide which database I should use for an application I'm making. I was leaning toward DynamoDB because of its scalability, but then I read in the documentation that there is a limit of 64 KB on the item size, although it looks like MySQL has a similar restriction, documented here. This application will be storing a lot of data in two arrays, which could contain upwards of 10,000-100,000 strings each. I estimate that these strings will each be somewhere around 20 characters long, so each element of the array will be around 40 bytes and each array could be around 4 MB. Given this predicament, what database on Amazon AWS would you use, or how would you get around the per-row size limit? Thanks in advance, Logan Besecker
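    One common workaround (a sketch of my own, not anything from the DynamoDB docs) is to split each array across several items, keyed by a chunk index, so that every stored item stays under the size limit and the array is reassembled on read:

        def chunk_strings(values, max_bytes=60 * 1024):
            """Split a list of strings into chunks whose UTF-8 size stays below max_bytes."""
            chunks, current, size = [], [], 0
            for value in values:
                value_size = len(value.encode("utf-8")) + 2   # rough per-element overhead (assumption)
                if current and size + value_size > max_bytes:
                    chunks.append(current)
                    current, size = [], 0
                current.append(value)
                size += value_size
            if current:
                chunks.append(current)
            return chunks

        # Each chunk would then be written as its own item/row, e.g. keyed by
        # (array_id, chunk_index), instead of one oversized row.
        chunks = chunk_strings(["string-%d" % i for i in range(100_000)])
        print(len(chunks), "items instead of one oversized row")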

    Read the article

  • Earn $10 for each friend you refer to FREE Amazon Mom Account

    - by Gopinath
    If you are looking for a way to earn a handful of referral bonuses, here is a great deal from Amazon. You can earn $10 for every friend or family member you refer to join a free Amazon Mom trial account. What is Amazon Mom, by the way? Amazon Mom is a free membership program created especially for parents and caretakers of small children. It gives free two-day shipping of products purchased on Amazon.com and a 20% discount on diapers, wipes and other baby stuff. Though the name says Mom, it's open to anyone who has children. It does not matter whether you are a father, grandparent, aunt or uncle; you can join the program and enjoy all its benefits. It costs nothing to join the FREE 3-month trial: Amazon Mom costs $79/year, but anyone can join the FREE 3-month trial and explore it at no cost. At the end of your 3-month trial you may either continue the program by paying the required amount or just opt out of it without any charges. You can learn more details about the Amazon Mom program benefits over here. To earn the bonus you need to refer friends to join the free 3-month trial, and as soon as they join, Amazon will automatically credit a $10 bonus to your Amazon.com account. Did I ever make money? A couple of weeks ago I saw this promotion and referred my friends. They loved the program as it gives a lot of discounts on baby diapers and wipes, and they immediately joined. Within 10 days of them joining the program, Amazon sent me emails confirming the referral bonus. Here is a screen grab of one such referral email, and my Amazon bonuses are adding up every day as referrals pull more people into this program. How to refer friends and earn the bonus? So you are ready to refer your friends; here are the steps to be followed: sign in to your Amazon.com account, go to the Amazon Mom Referral page and copy the referral link displayed on screen, then start sharing the link with your friends and request them to join the free trial. If you own a blog or website, write about the program and let your readers know about it. You can also have an image banner on your website with the referral link. Facebook and Twitter are the other two places where you can share referral links and bring your friends on board. Know the rules and don't game the system: the Amazon Mom referral program has a few conditions that must be satisfied. Make sure that you read and understand all of them. The final and most important one is not to game Amazon! Yes, don't play tricks like referring yourself or creating fake Amazon Mom accounts in order to earn money. You may be able to cheat Amazon for a while, but as soon as Amazon detects the fraud you will be booted out of their system. Staying on the good side always takes you in the right direction and helps you earn money.

    Read the article

  • AMIs in Amazon EC2

    - by Jack of Trades
    I really like the Amazon EC2 environment, and thought I'd spend a bit of time playing around with various types of public (Windows!) AMI servers. But testing has been a bit, well, questionable. Some of my findings: It's very difficult to know what exactly a specific public EC2 image is supposed to be doing; many images come with little to no information. I can't seem to find the passwords to log onto various Windows images. Why are they public if they can't be used?! Lots of images are S3-based rather than EBS-backed. This is very annoying, as S3 takes a lot longer to do pretty much anything (stop, image, etc.). I am only testing images here, so of course I don't question the value of S3 for other purposes. The description of what an image does is almost useless and often confusing. Have others come across these EC2 issues? Again, my interest was just to play around with public images for testing/experimentation/etc., and therefore these issues may not be too relevant for more normal EC2 deployment uses.

    Read the article

  • Processing-time billing in Amazon EC2

    - by Rafael Almeida
    Hi all! I think my question is fairly basic, but I would like a clarification: on the AWS pricing page we can see that Amazon charges around $0.10 per 'instance computing hour'. I've seen in a blog post somewhere (can't remember where exactly, and even if I did I think it was in Portuguese anyway) that this means your minimum monthly payment would be $72 (= $0.10/hour x 24 hours x 30 days). Is this correct? (I don't think it is!) My understanding is that this 'virtual computing time' is only used when your machine is actually doing something (serving pages, serving the admin via ssh, whatever), so real billable usage would be less than 720 hours/month in most webserver scenarios. Is my view correct? If it is, then it leads me to another question: is it economically interesting to buy access to one of these instances for testing? I mean, would I have the 'freedom' to 'forget' about it for a month and receive a very-close-to-zero (as in, a few cents) bill? Do you do it, or know of anybody who does? Any thoughts on the matter (as in, "yes, it's a good idea", or "yes, but there's this gotcha: ...", or "no, nobody does it because of ...")? PS: sorry for the long question text. I highlighted the main questions for easy viewing. Also, I'm not sure if this question is actually more than one and if it's desirable for the community, so sorry if it is too! Thanks in advance!
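    For what it's worth, the arithmetic quoted above works out as follows (a sketch assuming the $0.10 rate and that hours accrue only while the instance is left running):

        hourly_rate = 0.10                    # quoted rate, $ per instance-hour
        always_on = hourly_rate * 24 * 30     # instance left running the whole month
        part_time = hourly_rate * 8 * 30      # instance run ~8 hours/day, stopped otherwise

        print(f"Running 24x7:    ${always_on:.2f}/month")   # $72.00
        print(f"Running 8 h/day: ${part_time:.2f}/month")   # $24.00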

    Read the article

  • Amazon Kindle Fire User Agent String

    - by Gopinath
    Today I was searching for the Amazon Kindle Fire user agent string so that I could trick websites into thinking I'm browsing with a Kindle Fire. To my surprise I found the following two variants of the user agent listed on blogs, but I'm not sure which one is right, or whether the Kindle Fire generates two types of user agent strings. The first one is given by the prominent blogger and WSJ tech columnist Amit Agarwal, and I vote for him as he is a highly reputed person. Mozilla/5.0 (Linux; U; Android 2.3.4; en-us; Kindle Fire Build/GINGERBREAD) AppleWebKit/533.1 (KHTML, like Gecko) Version/4.0 Mobile Safari/533.1 The second variant is found on this website and I'm not sure about the authority of the blogger. Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_3; en-us; Silk/1.1.0-80) AppleWebKit/533.16 (KHTML, like Gecko) Version/5.0 Safari/533.16 Silk-Accelerated=true

    Read the article

  • Can't connect to EC2 instance in VPC (Amazon AWS)

    - by Ryan Lynch
    I've taken the following steps: created a VPC (with a single public subnet), added an EC2 instance to the VPC, allocated an Elastic IP, associated the Elastic IP with the instance, created a security group and assigned it to the instance, and modified the security rules to allow inbound ICMP echo and TCP on port 22. I've done all this and I still can't ping or SSH into the instance. If I follow the same steps minus the VPC bits I am able to set this up without issue. What step am I missing?

    Read the article

  • How can I script an alert for when my Amazon Web Service usage goes above a certain amount?

    - by frabcus
    We're using S3, SimpleDB and SQS on quite a complicated project. I'd like to be able to automatically track their usage, to be sure we don't suddenly spend large amounts of money when we didn't intend to (perhaps because of a bug). Is there a way of reading the usage figures of all Amazon Web Services and/or the current real time dollar cost of an account from a script? Or any service or script which provides alerts based on that?
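    One scripted approach (a sketch, not something the question mentions: it assumes billing metric collection is enabled on the account, and the SNS topic ARN is a placeholder) is to put a CloudWatch alarm on the AWS/Billing EstimatedCharges metric, which AWS publishes in us-east-1:

        import boto3

        cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")   # billing metrics live here
        cloudwatch.put_metric_alarm(
            AlarmName="estimated-charges-over-100-usd",
            Namespace="AWS/Billing",
            MetricName="EstimatedCharges",
            Dimensions=[{"Name": "Currency", "Value": "USD"}],
            Statistic="Maximum",
            Period=6 * 60 * 60,              # the metric is only published a few times a day
            EvaluationPeriods=1,
            Threshold=100.0,                 # alert once the month-to-date bill passes $100
            ComparisonOperator="GreaterThanThreshold",
            AlarmActions=["arn:aws:sns:us-east-1:123456789012:billing-alerts"],   # placeholder topic
        )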

    Read the article

  • Time between AWS Notifying of Scale Down and Terminating instance

    - by SteveEdson
    Here is the scenario: there are multiple EC2 instances behind a load balancer. When traffic dies down, the SCALE_DOWN policy is triggered by a CloudWatch alarm. What I would like is for the instance that is going to be terminated, or a separate server altogether, to be able to run a quick script that will execute a few commands to ensure all data has been transferred. My initial question was going to be how I can send a notification when an instance is going to be terminated by an auto scaling SCALE_DOWN policy. But then I saw the question "Amazon EC2 notifying the instance when the autoscale service terminates it". If the notification is sent, how much time is there before the instance actually gets terminated? Are there any parameters to specify this time? Would it be a better idea to notify an instance that it is no longer needed, and get the instance to terminate itself once it has finished running the final script? Or am I making this into a bigger problem than it actually is, and there's a far simpler solution?
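    For reference, Auto Scaling later gained lifecycle hooks, which hold a terminating instance in a wait state for a configurable window before it is actually terminated. A rough boto3 sketch (the group name, queue and role ARNs are placeholders):

        import boto3

        autoscaling = boto3.client("autoscaling")
        autoscaling.put_lifecycle_hook(
            LifecycleHookName="drain-before-terminate",
            AutoScalingGroupName="my-web-asg",                                            # placeholder
            LifecycleTransition="autoscaling:EC2_INSTANCE_TERMINATING",
            NotificationTargetARN="arn:aws:sqs:us-east-1:123456789012:scale-down-queue",  # placeholder
            RoleARN="arn:aws:iam::123456789012:role/asg-lifecycle-role",                  # placeholder
            HeartbeatTimeout=300,      # seconds the instance is held before termination proceeds
            DefaultResult="CONTINUE",
        )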

    Read the article

  • Creating a URL from AWS CloudFront

    - by GroovyUser
    I have JS files which I have uploaded to Amazon S3 and linked with CloudFront. I got a URL something like this: dxxxxxxxx.cloudfront.net. But opening that URL in my browser, I'm getting an error: <Error> <Code>AccessDenied</Code> <Message>Access Denied</Message> <RequestId>xxxxxx</RequestId> <HostId> xxxxxxxxxxxxxxx </HostId> </Error> But what I actually want is to use the URL and add these files to my webpage. How can I do that? Thanks in advance.
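    One common cause of that AccessDenied response (assuming the distribution is not using an origin access identity) is that the objects in the S3 bucket are not publicly readable, or that the bare distribution root is being requested instead of an object key. A minimal boto3 sketch of making one object public (bucket and key are placeholders):

        import boto3

        s3 = boto3.client("s3")
        s3.put_object_acl(
            Bucket="my-js-bucket",      # placeholder: the bucket behind the distribution
            Key="scripts/app.js",       # placeholder object key
            ACL="public-read",
        )
        # The file would then be referenced by key, e.g.
        # https://dxxxxxxxx.cloudfront.net/scripts/app.js, rather than the bare domain.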

    Read the article

  • Cheapest Highly Available Web Server [closed]

    - by xyz
    I would like to create a highly available setup (e.g. a small cluster) for a web server, i.e. it will run Apache, PHP and MySQL. There will be between 2 and 8 small websites running with only very little traffic and workload. High availability is, however, very important. I don't want to be dependent on one data center, so there must be a minimum of 2 servers placed in different data centers, and if one server goes down, the user must experience no downtime, or only a minimum of downtime, and no data loss. I have considered Amazon AWS using its Elastic Load Balancing, since it is possible to buy 2 EC2 instances in 2 availability zones and set up load balancing and RDS (Multi-AZ). However, this seems rather expensive. Using the AWS price calculator http://calculator.s3.amazonaws.com/calc5.html it totals $185/month for the first year (including the free tier). Are my calculations incorrect, or is there a cheaper way to make this HA setup? Best regards

    Read the article
