Search Results

Search found 23561 results on 943 pages for 'mobile website'.


  • Contact Form + jQuery validationEngine

    - by BigMad
    I created this contact form using jQuery fadeLabel and validationEngine to beautify it. The page is index.php / index.html (I have not yet decided which of the two versions to use). These are the scripts in the index page:

    <script src="https://ajax.googleapis.com/ajax/libs/jquery/1.7.2/jquery.min.js"></script>
    <script src="/js/backtop.js"></script>
    <script src="/js/fadeLabel.js"></script>
    <script>
      $(document).ready(function () {
        $('form .fadeLabel').fadeLabel();
      });
    </script>
    <script src="/js/validationEngine-it.js"></script>
    <script src="/js/validationEngine.js"></script>
    <script>
      $(document).ready(function () {
        $("#form_box").validationEngine({
          ajaxSubmit: true,
          ajaxSubmitFile: "contact.php",
          ajaxSubmitMessage: "Thank you, We will contact you soon !",
          success: false,
          failure: function() {}
        });
      });
    </script>
    <script src="/js/contactform.js"></script>

    This is the relevant part of the form's markup:

    <p id="form_success" class="success hide"><strong>Grazie!</strong> Il tuo messaggio è stato inviato con successo.</p>
    <form id="form_box">
      <fieldset>
        <p><label for="name">Nome*</label><input type="text" id="name" name="name" class="validate[required] fadeLabel" value=""/></p>
        <p><label for="email">E-mail*</label><input type="email" id="email" name="email" class="validate[required,custom[email]] fadeLabel" value=""/></p>
        <p><label for="website">Sito web</label><input type="url" id="website" name="website" class="fadeLabel" value=""/></p>
        <p><label for="message">Messaggio*</label><textarea id="message" name="message" class="validate[required] fadeLabel" cols="30" rows="10"></textarea></p>
      </fieldset>
      <p id="form_submit" class="submit"><button class="send">Invia</button> *Campi obbligatori</p>
      <p id="form_send" class="send hide">Invio in corso&hellip;</p>
      <p id="form_error" class="submit error hide"><button class="send">Invia</button> Si prega di correggere l'errore e re-inviarlo.</p>
    </form>

    This is contact.php, which receives the data and sends 2 emails (one to me with the data, and a thank-you to the person who contacted me):

    <?php
    // include variables
    $my_email = "[email protected]";
    $my_site = "adrianogenovese.com";
    session_start();

    $name = $_POST['name'];
    $email = $_POST['email'];
    $website = $_POST['website'];
    $message = $_POST['message'];
    $ip = $_SERVER['REMOTE_ADDR'];

    // beginning of the email to me
    $to = $my_email;
    $sbj = "Richiesta Informazioni - $my_site";
    $msg = "
    <html>
    ... // the body of the email to me ...
    </html>
    ";
    $from = $email;
    $headers = 'MIME-Version: 1.0' . "\n";
    $headers .= 'Content-type: text/html; charset=iso-8859-1' . "\n";
    $headers .= "From: $from";
    mail($to, $sbj, $msg, $headers); // email sent to me

    // beginning of the email to the recipient
    $toClient = $email;
    $msgClient = "
    <html>
    ... // the body of the email to the recipient ...
    </html>
    ";
    $fromClient = $my_email;
    $sbjClient = "Grazie $name per aver contattato $my_site";
    $headersClient = 'MIME-Version: 1.0' . "\r\n";
    $headersClient .= 'Content-type: text/html; charset=iso-8859-1' . "\r\n";
    $headersClient .= "From: $fromClient";
    mail($toClient, $sbjClient, $msgClient, $headersClient); // mail sent to the client

    // order confirmation email
    // Reset error
    session_destroy();
    exit;
    ?>

    And this is the contact form script, contactform.js:

    $(document).ready(function() {
      $(".send").click(function() {
        $("#form_send").removeClass('hide');
        $("#form_submit").addClass('hide');
        $("#form_error").addClass('hide');
        var name = $("#name").val();
        var email = $("#email").val();
        var website = $("#website").val();
        var message = $("#message").val();
        if (name == "" || email == "") {
          $("#form_send").addClass('hide');
          $("#form_error").removeClass('hide');
        } else {
          $.ajax({
            type: "POST",
            url: "contatti/contact.php",
            data: "name=" + name + "&email=" + email + "&message=" + message + "&website=" + website,
            dataType: "html",
            success: function(msg) {
              $("#form_send").addClass('hide').delay(3000).fadeOut(3000);
              $("#form_success").removeClass('hide');
              $("#form_box").addClass('hide').slideUp(2000).fadeOut();
            },
            error: function() {
              alert("An unexpected error occurred...");
            }
          });
        }
      }); // end form
    }); // end DOM

    The jQuery effects seem to work very well, and the one thing that does work is what I wanted: the form neither refreshes the page nor navigates to another page. However, I still have the following problems:

    - The alert in contactform.js always fires.
    - No mail is sent at all, neither to me nor to the recipient.
    - I cannot get delay().fadeOut()/fadeIn() and slideUp().fadeOut() to work properly: the "sending" message ($("#form_send").removeClass('hide')) should stay visible for about 3 seconds before anything else happens, then the form should slide up over a couple of seconds ($("#form_box").addClass('hide')), leaving only the success message ($("#form_success").removeClass('hide')) displayed.
    - The form data also appears in the address bar (e.g. ../index.php?name=test&email=example%40mail.com&website=&message=helloworld).

    Read the article

  • Windows Azure: Import/Export Hard Drives, VM ACLs, Web Sockets, Remote Debugging, Continuous Delivery, New Relic, Billing Alerts and More

    - by ScottGu
    Two weeks ago we released a giant set of improvements to Windows Azure, as well as a significant update of the Windows Azure SDK. This morning we released another massive set of enhancements to Windows Azure. Today's new capabilities include:

    - Storage: Import/Export Hard Disk Drives to your Storage Accounts
    - HDInsight: General Availability of our Hadoop Service in the cloud
    - Virtual Machines: New VM Gallery, ACL support for VIPs
    - Web Sites: WebSocket and Remote Debugging Support
    - Notification Hubs: Segmented customer push notification support with tag expressions
    - TFS & GIT: Continuous Delivery Support for Web Sites + Cloud Services
    - Developer Analytics: New Relic support for Web Sites + Mobile Services
    - Service Bus: Support for partitioned queues and topics
    - Billing: New Billing Alert Service that sends email notifications when your bill hits a threshold you define

    All of these improvements are now available to use immediately (note that some features are still in preview). Below are more details about them.

    Storage: Import/Export Hard Disk Drives to Windows Azure

    I am excited to announce the preview of our new Windows Azure Import/Export Service! The Windows Azure Import/Export Service enables you to move large amounts of on-premises data into and out of your Windows Azure Storage accounts. It does this by enabling you to securely ship hard disk drives directly to our Windows Azure data centers. Once we receive the drives we'll automatically transfer the data to or from your Windows Azure Storage account. This enables you to import or export massive amounts of data more quickly and cost effectively (and not be constrained by available network bandwidth).

    Encrypted Transport

    Our Import/Export service provides built-in support for BitLocker disk encryption – which enables you to securely encrypt data on the hard drives before you send it, and not have to worry about it being compromised even if the disk is lost/stolen in transit (since the content on the transported hard drives is completely encrypted and you are the only one who has the key to it). The drive preparation tool we are shipping today makes setting up BitLocker encryption on these hard drives easy.

    How to Import/Export your first Hard Drive of Data

    You can read our Getting Started Guide to learn more about how to begin using the import/export service. You can create import and export jobs via the Windows Azure Management Portal as well as programmatically using our Service Management APIs.

    It is really easy to create a new import or export job using the Windows Azure Management Portal. Simply navigate to a Windows Azure storage account, and then click the new Import/Export tab now available within it (note: if you don't have this tab make sure to sign-up for the Import/Export preview). Then click the "Create Import Job" or "Create Export Job" commands at the bottom of it. This will launch a wizard that easily walks you through the steps required.

    For more comprehensive information about Import/Export, refer to the Windows Azure Storage team blog. You can also send questions and comments to the [email protected] email address.

    We think you'll find this new service makes it much easier to move data into and out of Windows Azure, and it will dramatically cut down the network bandwidth required when working on large data migration projects. We hope you like it.

    HDInsight: 100% Compatible Hadoop Service in the Cloud

    Last week we announced the general availability release of Windows Azure HDInsight.
    HDInsight is a 100% compatible Hadoop service that allows you to easily provision and manage Hadoop clusters for big data processing in Windows Azure. This release is now live in production, backed by an enterprise SLA, supported 24x7 by Microsoft Support, and is ready to use for production scenarios.

    HDInsight allows you to use Apache Hadoop tools, such as Pig and Hive, to process large amounts of data in Windows Azure Blob Storage. Because data is stored in Windows Azure Blob Storage, you can choose to dynamically create Hadoop clusters only when you need them, and then shut them down when they are no longer required (since you pay only for the time the Hadoop cluster instances are running, this provides a super cost effective way to use them). You can create Hadoop clusters using either the Windows Azure Management Portal (see below) or using our PowerShell and Cross Platform Command line tools.

    The import/export hard drive support that came out today is a perfect companion service to use with HDInsight – the combination allows you to easily ingest, process and optionally export a limitless amount of data. We've also integrated HDInsight with our Business Intelligence tools, so users can leverage familiar tools like Excel in order to analyze the output of jobs. You can find out more about how to get started with HDInsight here.

    Virtual Machines: VM Gallery Enhancements

    Today's update of Windows Azure brings with it a new Virtual Machine gallery that you can use to create new VMs in the cloud. You can launch the gallery by doing New->Compute->Virtual Machine->From Gallery within the Windows Azure Management Portal.

    The new Virtual Machine Gallery includes some nice enhancements that make it even easier to use:

    - Search: You can now easily search and filter images using the search box in the top-right of the dialog. For example, simply type "SQL" and we'll filter to show those images in the gallery that contain that substring.
    - Category Tree-view: Each month we add more built-in VM images to the gallery. You can continue to browse these using the "All" view within the VM Gallery – or now quickly filter them using the category tree-view on the left-hand side of the dialog. For example, by selecting "Oracle" in the tree-view you can now quickly filter to see the official Oracle supplied images.
    - MSDN and Supported checkboxes: With today's update we are also introducing filters that make it easy to filter out types of images that you may not be interested in. The first checkbox is MSDN: using this filter you can exclude any image that is not part of the Windows Azure benefits for MSDN subscribers (which have highly discounted pricing – you can learn more about the MSDN pricing here). The second checkbox is Supported: this filter will exclude any image that contains prerelease software, so you can feel confident that the software you choose to deploy is fully supported by Windows Azure and our partners.
    - Sort options: We sort gallery images by what we think customers are most interested in, but sometimes you might want to sort using different views. So we're providing some additional sort options, like "Newest," to customize the image list for what suits you best.
    - Pricing information: We now provide additional pricing information about images and options on how to cost effectively run them directly within the VM Gallery.

    The above improvements make it even easier to use the VM Gallery and to quickly create, launch and run Virtual Machines in the cloud.

    Virtual Machines: ACL Support for VIPs

    A few months ago we exposed the ability to configure Access Control Lists (ACLs) for Virtual Machines using Windows PowerShell cmdlets and our Service Management API. With today's release, you can now configure VM ACLs using the Windows Azure Management Portal as well. You can do this by clicking the new Manage ACL command in the Endpoints tab of a virtual machine instance.

    This will enable you to configure an ordered list of permit and deny rules to scope the traffic that can access your VM's network endpoints. For example, if you were on a virtual network, you could limit RDP access to a Windows Azure virtual machine to only a few computers attached to your enterprise. Or if you weren't on a virtual network you could alternatively limit traffic from public IPs that can access your workloads.

    Here are the default behaviors for ACLs in Windows Azure:

    - By default (i.e. no rules specified), all traffic is permitted.
    - When using only Permit rules, all other traffic is denied.
    - When using only Deny rules, all other traffic is permitted.
    - When there is a combination of Permit and Deny rules, all other traffic is denied.

    Lastly, remember that configuring endpoints does not automatically configure them within the VM if it also has firewall rules enabled at the OS level. So if you create an endpoint using the Windows Azure Management Portal, Windows PowerShell, or REST API, be sure to also configure your guest VM firewall appropriately as well.

    Web Sites: Web Sockets Support

    With today's release you can now use Web Sockets with Windows Azure Web Sites. This feature enables you to easily integrate real-time communication scenarios within your web based applications, and is available at no extra charge (it even works with the free tier). Higher level programming libraries like SignalR and socket.io are also now supported with it.

    You can enable Web Sockets support on a web site by navigating to the Configure tab of a Web Site, and by toggling Web Sockets support to "on". Once Web Sockets is enabled you can start to integrate some really cool scenarios into your web applications. Check out the new SignalR documentation hub on www.asp.net to learn more about some of the awesome scenarios you can do with it.

    Web Sites: Remote Debugging Support

    The Windows Azure SDK 2.2 we released two weeks ago introduced remote debugging support for Windows Azure Cloud Services. With today's Windows Azure release we are extending this remote debugging support to also work with Windows Azure Web Sites.

    With live, remote debugging support inside of Visual Studio, you are able to have more visibility than ever before into how your code is operating live in Windows Azure. It is now super easy to attach the debugger and quickly see what is going on with your application in the cloud.

    Remote Debugging of a Windows Azure Web Site using VS 2013

    Enabling the remote debugging of a Windows Azure Web Site using VS 2013 is really easy. Start by opening up your web application's project within Visual Studio. Then navigate to the "Server Explorer" tab within Visual Studio, and click on the deployed web-site you want to debug that is running within Windows Azure using the Windows Azure->Web Sites node in the Server Explorer. Then right-click and choose the "Attach Debugger" option on it.

    When you do this Visual Studio will remotely attach the debugger to the Web Site running within Windows Azure.
    The debugger will then stop the web site's execution when it hits any break points that you have set within your web application's project inside Visual Studio. For example, below I set a breakpoint on the "ViewBag.Message" assignment statement within the HomeController of the standard ASP.NET MVC project template. When I hit refresh on the "About" page of the web site within the browser, the breakpoint was triggered and I am now able to debug the app remotely using Visual Studio.

    Note above how we can debug variables (including autos/watchlist/etc), as well as use the Immediate and Command Windows. In the debug session above I used the Immediate Window to explore some of the request object state, as well as to dynamically change the ViewBag.Message property. When we click the "Continue" button (or press F5) the app will continue execution and the Web Site will render the content back to the browser. This makes it super easy to debug web apps remotely.

    Tips for Better Debugging

    To get the best experience while debugging, we recommend publishing your site using the Debug configuration within Visual Studio's Web Publish dialog. This will ensure that debug symbol information is uploaded to the Web Site, which will enable a richer debug experience within Visual Studio. You can find this option on the Web Publish dialog on the Settings tab.

    When you ultimately deploy/run the application in production we recommend using the "Release" configuration setting – the release configuration is memory optimized and will provide the best production performance. To learn more about diagnosing and debugging Windows Azure Web Sites read our new Troubleshooting Windows Azure Web Sites in Visual Studio guide.

    Notification Hubs: Segmented Push Notification support with tag expressions

    In August we announced the General Availability of Windows Azure Notification Hubs – a powerful Mobile Push Notifications service that makes it easy to send high volume push notifications with low latency from any mobile app back-end. Notification hubs can be used with any mobile app back-end (including ones built using our Mobile Services capability) and can also be used with back-ends that run in the cloud as well as on-premises.

    Beginning with the initial release, Notification Hubs allowed developers to send personalized push notifications to both individual users as well as groups of users by interest, by associating their devices with tags representing the logical target of the notification. For example, by registering all devices of customers interested in a favorite MLB team with a corresponding tag, it is possible to broadcast one message to millions of Boston Red Sox fans and another message to millions of St. Louis Cardinals fans, each with a single API call.

    New support for using tag expressions to enable advanced customer segmentation

    With today's release we are adding support for even more advanced customer targeting. You can now identify customers that you want to send push notifications to by defining rich tag expressions. With tag expressions, you can now not only broadcast notifications to Boston Red Sox fans, but take that segmenting a step further and reach more granular segments. This opens up a variety of scenarios, for example:

    - Offers based on multiple preferences – e.g. send a game day vegetarian special to users tagged as both a Boston Red Sox fan AND a vegetarian
    - Push content to multiple segments in a single message – e.g. rain delay information only to users who are tagged as either a Boston Red Sox fan OR a St. Louis Cardinals fan
    - Avoid presenting subsets of a segment with irrelevant content – e.g. season ticket availability reminder to users who are tagged as a Boston Red Sox fan but NOT also a season ticket holder

    To illustrate with code, consider a restaurant chain app that sends an offer related to a Red Sox vs Cardinals game for users in Boston. Devices can be tagged by your app with location tags (e.g. "Loc:Boston") and interest tags (e.g. "Follows:RedSox", "Follows:Cardinals"), and then a notification can be sent by your back-end to "(Follows:RedSox || Follows:Cardinals) && Loc:Boston" in order to deliver an offer to all devices in Boston that follow either the RedSox or the Cardinals. This can be done directly in your server backend send logic using the code below:

    var notification = new WindowsNotification(messagePayload);
    hub.SendNotificationAsync(notification, "(Follows:RedSox || Follows:Cardinals) && Loc:Boston");

    In your expressions you can use all Boolean operators: AND (&&), OR (||), and NOT (!). Some other cool use cases for tag expressions that are now supported include:

    - Social: To "all my group except me" – group:id && !user:id
    - Events: Touchdown event is sent to everybody following either team or any of the players involved in the action: Followteam:A || Followteam:B || followplayer:1 || followplayer:2 …
    - Hours: Send notifications at specific times. E.g. tag devices with time zone and when it is 12pm in Seattle send to: GMT8 && follows:thaifood
    - Versions and platforms: Send a reminder to people still using your first version for Android – version:1.0 && platform:Android

    For help on getting started with Notification Hubs, visit the Notification Hub documentation center. Then download the latest NuGet package (or use the Notification Hubs REST APIs directly) to start sending push notifications using tag expressions. They are really powerful and enable a bunch of great new scenarios.

    TFS & GIT: Continuous Delivery Support for Web Sites + Cloud Services

    With today's Windows Azure release we are making it really easy to enable continuous delivery support with Windows Azure and Team Foundation Services. Team Foundation Services is a cloud based offering from Microsoft that provides integrated source control (with both TFS and Git support), build server, test execution, collaboration tools, and agile planning support. It makes it really easy to setup a team project (complete with automated builds and test runners) in the cloud, and it has really rich integration with Visual Studio.

    With today's Windows Azure release it is now really easy to enable continuous delivery support with both TFS and Git based repositories hosted using Team Foundation Services. This enables a workflow where, when code is checked in, built successfully on an automated build server, and all tests pass on it, I can automatically have the app deployed on Windows Azure with zero manual intervention or work required.

    The screen-shots below demonstrate how to quickly setup a continuous delivery workflow to Windows Azure with a Git-based ASP.NET MVC project hosted using Team Foundation Services.

    Enabling Continuous Delivery to Windows Azure with Team Foundation Services

    The project I'm going to enable continuous delivery with is a simple ASP.NET MVC project whose source code I'm hosting using Team Foundation Services.
    I did this by creating a "SimpleContinuousDeploymentTest" repository there using Git – and then used the new built-in Git tooling support within Visual Studio 2013 to push the source code to it. Below is a screen-shot of the Git repository hosted within Team Foundation Services.

    I can access the repository within Visual Studio 2013 and easily make commits with it (as well as branch, merge and do other tasks). Using VS 2013 I can also setup automated builds to take place in the cloud using Team Foundation Services every time someone checks in code to the repository.

    The cool thing about this is that I don't have to buy or rent my own build server – Team Foundation Services automatically maintains its own build server farm and can automatically queue up a build for me (for free) every time someone checks in code using the above settings. This build server (and automated testing) support now works with both TFS and Git based source control repositories.

    Connecting a Team Foundation Services project to Windows Azure

    Once I have a source repository hosted in Team Foundation Services with Automated Builds and Testing set up, I can then go even further and set it up so that it will be automatically deployed to Windows Azure when a source code commit is made to the repository (assuming the Build + Tests pass). Enabling this is now really easy.

    To set this up with a Windows Azure Web Site simply use the New->Compute->Web Site->Custom Create command inside the Windows Azure Management Portal. This will create a dialog like the one below. I gave the web site a name and then made sure the "Publish from source control" checkbox was selected.

    When we click next we'll be prompted for the location of the source repository. We'll select "Team Foundation Services". Once we do this we'll be prompted for the Team Foundation Services account that our source repository is hosted under (in this case my TFS account is "scottguthrie").

    When we click the "Authorize Now" button we'll be prompted to give Windows Azure permissions to connect to the Team Foundation Services account. Once we do this we'll be prompted to pick the source repository we want to connect to. Starting with today's Windows Azure release you can now connect to both TFS and Git based source repositories. This new support allows me to connect to the "SimpleContinuousDeploymentTest" repository we created earlier.

    Clicking the finish button will then create the Web Site with the continuous delivery hooks setup with Team Foundation Services. Now every time someone pushes source control to the repository in Team Foundation Services, it will kick off an automated build, run all of the unit tests in the solution, and if they pass the app will be automatically deployed to our Web Site in Windows Azure. You can monitor the history and status of these automated deployments using the Deployments tab within the Web Site.

    This enables a really slick continuous delivery workflow, and enables you to build and deploy apps in a really nice way.

    Developer Analytics: New Relic support for Web Sites + Mobile Services

    With today's Windows Azure release we are making it really easy to enable Developer Analytics and Monitoring support with both Windows Azure Web Sites and Windows Azure Mobile Services. We are partnering with New Relic, who provide a great dev analytics and app performance monitoring offering, to enable this – and we have updated the Windows Azure Management Portal to make it really easy to configure.

    Enabling New Relic with a Windows Azure Web Site

    Enabling New Relic support with a Windows Azure Web Site is now really easy. Simply navigate to the Configure tab of a Web Site and scroll down to the "developer analytics" section that is now within it. Clicking the "add-on" button will display some additional UI. If you don't already have a New Relic subscription, you can click the "view windows azure store" button to obtain a subscription (note: New Relic has a perpetually free tier so you can enable it even without paying anything).

    Clicking the "view windows azure store" button will launch the integrated Windows Azure Store experience we have within the Windows Azure Management Portal. You can use this to browse from a variety of great add-on services – including New Relic. Select "New Relic" within the dialog, then click the next button, and you'll be able to choose which type of New Relic subscription you wish to purchase. For this demo we'll simply select the "Free Standard Version" – which does not cost anything and can be used forever.

    Once we've signed-up for our New Relic subscription and added it to our Windows Azure account, we can go back to the Web Site's configuration tab and choose to use the New Relic add-on with our Windows Azure Web Site. We can do this by simply selecting it from the "add-on" dropdown (it is automatically populated within it once we have a New Relic subscription in our account). Clicking the "Save" button will then cause the Windows Azure Management Portal to automatically populate all of the needed New Relic configuration settings to our Web Site.

    Deploying the New Relic Agent as part of a Web Site

    The final step to enable developer analytics using New Relic is to add the New Relic runtime agent to our web app. We can do this within Visual Studio by right-clicking on our web project and selecting the "Manage NuGet Packages" context menu. This will bring up the NuGet package manager. You can search for "New Relic" within it to find the New Relic agent. Note that there is both a 32-bit and 64-bit edition of it – make sure to install the version that matches how your Web Site is running within Windows Azure (note: you can configure your Web Site to run in either 32-bit or 64-bit mode using the Web Site's "Configuration" tab within the Windows Azure Management Portal).

    Once we install the NuGet package we are all set to go. We'll simply re-publish the web site again to Windows Azure and New Relic will now automatically start monitoring the application.

    Monitoring a Web Site using New Relic

    Now that the application has developer analytics support with New Relic enabled, we can launch the New Relic monitoring portal to start monitoring its health. We can do this by clicking on the "Add Ons" tab in the left-hand side of the Windows Azure Management Portal. Then select the New Relic add-on we signed-up for within it. The Windows Azure Management Portal will provide some default information about the add-on when we do this. Clicking the "Manage" button in the tray at the bottom will launch a new browser tab and single-sign us into the New Relic monitoring portal associated with our account.

    When we do this a new browser tab will launch with the New Relic admin tool loaded within it. We can now see insights into how our app is performing – without having to have written a single line of monitoring code.
    The New Relic service provides a ton of great built-in monitoring features allowing us to quickly see:

    - Performance times (including browser rendering speed) for the overall site and individual pages. You can optionally set alert thresholds to trigger if the speed does not meet a threshold you specify.
    - Information about where in the world your customers are hitting the site from (and how performance varies by region)
    - Details on the latency performance of external services your web apps are using (for example: SQL, Storage, Twitter, etc)
    - Error information including call stack details for exceptions that have occurred at runtime
    - SQL Server profiling information – including which queries executed against your database and what their performance was
    - And a whole bunch more…

    The cool thing about New Relic is that you don't need to write monitoring code within your application to get all of the above reports (plus a lot more). The New Relic agent automatically enables the CLR profiler within applications and automatically captures the information necessary to identify these. This makes it super easy to get started and immediately have a rich developer analytics view for your solutions with very little effort.

    If you haven't tried New Relic out yet with Windows Azure I recommend you do so – I think you'll find it helps you build even better cloud applications. Following the above steps will help you get started and deliver a really good application monitoring solution in only minutes.

    Service Bus: Support for partitioned queues and topics

    With today's release, we are enabling support within Service Bus for partitioned queues and topics. Enabling partitioning enables you to achieve a higher message throughput and better availability from your queues and topics. Higher message throughput is achieved by implementing multiple message brokers for each partitioned queue and topic. The multiple messaging stores will also provide higher availability.

    You can create a partitioned queue or topic by simply checking the Enable Partitioning option in the custom create wizard for a Queue or Topic. Read this article to learn more about partitioned queues and topics and how to take advantage of them today.

    Billing: New Billing Alert Service

    Today's Windows Azure update enables a new Billing Alert Service Preview that enables you to get proactive email notifications when your Windows Azure bill goes above a certain monetary threshold that you configure. This makes it easier to manage your bill and avoid potential surprises at the end of the month.

    With the Billing Alert Service Preview, you can now create email alerts to monitor and manage your monetary credits or your current bill total. To set up an alert, first sign-up for the free Billing Alert Service Preview. Then visit the account management page, click on a subscription you have setup, and then navigate to the new Alerts tab that is available.

    The alerts tab allows you to setup email alerts that will be sent automatically once a certain threshold is hit. For example, by clicking the "add alert" button above I can setup a rule to send myself email anytime my Windows Azure bill goes above $100 for the month.

    The Billing Alert Service will evolve to support additional aspects of your bill as well as support multiple forms of alerts such as SMS. Try out the new Billing Alert Service Preview today and give us feedback.

    Summary

    Today's Windows Azure release enables a ton of great new scenarios, and makes building applications hosted in the cloud even easier.

    If you don't already have a Windows Azure account, you can sign-up for a free trial and start using all of the above features today. Then visit the Windows Azure Developer Center to learn more about how to build apps with it.

    Hope this helps,

    Scott

    P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu

    Read the article

  • .co.uk targeted for google.co.uk, .com targeted for google.com

    - by Higgs Boson
    We've had a website running on a .co.uk domain for some years; this domain is listed in the SERPS for our brand on both google.co.uk and google.com. We get little traffic from anywhere other than the UK because the website is targeted at the UK market with specific UK keywords. This is great; however, we recently purchased the .com domain with the intention of producing a second version of the website targeted at the United States with US-specific keywords, i.e. targeting and moving into the US marketplace. We have used Google Webmaster Tools to set the geographic target for the .com domain to be the US. I was expecting ONLY the .com site to show up when searching google.com, and only the .co.uk site to show up when searching google.co.uk. However, when we search google.com for our 'brand', the .co.uk site is listed in the SERPS. We would prefer the .com to appear in the SERPS on google.com. Is there anything we can do?

    Read the article

  • Backup options in SharePoint 2007

    - by sreejukg
    It is very important to make sure the server farm backup is taken properly, so that in case of any disaster the administrator has a recent backup that can be used for a restore. This article addresses some of the options available for backup/restore in SharePoint 2007.

    Backup

    There are two options that can be utilized to take a backup of SharePoint sites.

    Using the SharePoint Central Administration website

    Using the SharePoint Central Administration website, you can do backup/restore from the user interface. Using the Central Administration website you can back up the following:

    · Server farm
    · Web application
    · Content databases

    Follow these steps to take a backup of the server farm using Central Administration:

    1. Open the Central Administration website.
    2. Navigate to Operations -> Backup and Restore -> Perform a backup.
    3. Here you will have options to choose the items to back up. Select Farm (the top most item in the list).
    4. Once you select the items to back up, click on "Continue to backup options".
    5. Select "Full" as the type of backup.
    6. In the backup file location, enter the path where you need to store the backup. The path should be a UNC path; e.g. for the C drive you may use \\server\c$\mybackupFolder
    7. Click OK.
    8. You will now be redirected to the Backup and Restore Status page. This page shows the progress of the backup operation. You can use the refresh button to update the status (the page also refreshes automatically every 30 seconds). Once completed, you can find the files in the specified folder.

    Using the STSADM command line tool

    SharePoint comes with the STSADM command line tool. STSADM provides a lot of administrative operations that can be performed on SharePoint 2007 sites. You can find the STSADM command in the following location: C:\Program Files\Common Files\Microsoft shared\web server extensions\12\bin (you may change the drive letter according to your installation). STSADM provides a method for performing the Office SharePoint Server 2007 administration tasks at the command line or by using batch files or scripts, and provides access to operations not available by using the Central Administration site.

    The general syntax for STSADM is as follows:

    STSADM -operation OperationName -parameter1 value1 -parameter2 value2 ...

    Using STSADM you can back up the following:

    · Server farm
    · Web application
    · Content databases

    To perform any STSADM operation you need to be a member of the administrators group. Follow these steps to take a backup of the SharePoint server farm using the STSADM tool (note: make sure you are logged in to the computer where the Central Administration website is installed):

    1. Open the command prompt (you should run the command prompt with administrator privileges).
    2. Change the working directory to C:\Program Files\Common Files\Microsoft shared\web server extensions\12\bin
    3. Enter the following command, then press Enter:

    stsadm -o backup -directory <UNC path> -backupmethod full

    4. You will get a success/failure message once the command finishes.

    How to schedule the backup

    There is no option to schedule a backup using the Central Administration site, and there is no operation provided by STSADM to automate the backup, so farm administrators need to take backups at regular intervals. To achieve this, you can write a batch file that includes the STSADM command to take a full backup of the server. This batch file can be scheduled using the Windows Task Scheduler to execute at certain intervals (see the scheduling example after this section).

    Sample batch file

    1. Open Notepad (or any other text editor).
    2. Enter the following commands:

    @echo off
    echo ===============================================================
    echo Back up the farm to <C:\backup>
    echo ===============================================================
    cd %COMMONPROGRAMFILES%\Microsoft Shared\web server extensions\12\BIN
    @echo off
    stsadm.exe -o backup -directory "<\backup>" -backupmethod full
    echo completed

    3. Save the file with a .bat extension.

    You can schedule this batch file as you require.

    Other Options

    Using the STSADM tool, you can also take a backup of an individual site collection. The syntax for this is:

    stsadm -o backup -url <URL name for site collection> -filename <file name> [-overwrite]

    The explanations for the parameters are as follows:

    -url: the URL of the site collection you need to back up
    -filename: the name of the backup file, e.g. c:\backup.bak
    -overwrite: optional; indicates whether to overwrite the file if the specified filename already exists

    If you are creating a batch file for scheduling the backup of a site collection, you may want the backup filename to be generated automatically. One option is to include the date in the filename, so that you keep a backup for each day. E.g. the following commands can be utilized to create a site collection backup:

    @echo off
    echo ===============================================================
    echo Back up the farm to <C:\backup>
    echo ===============================================================
    echo ===============================================================
    echo getting todays date to a variable
    echo ===============================================================
    @For /F "tokens=1,2,3 delims=/ " %%A in ('Date /t') do @(
    Set Day=%%A
    Set Month=%%B
    Set Year=%%C
    Set todayDate=%%C%%B%%A
    )
    cd %COMMONPROGRAMFILES%\Microsoft Shared\web server extensions\12\BIN
    @echo off
    stsadm -o backup -url <sitecollection url> -filename \\ServerName\ShareName\Backup_%todayDate%.bak -overwrite
    echo completed

    To read more about the backup STSADM operation, see http://technet.microsoft.com/en-us/library/cc263441.aspx
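
    To actually register the batch file with the Windows Task Scheduler mentioned above, one option is the schtasks command line tool. The task name, script path, and time below are placeholders for illustration, not values from the article:

    schtasks /create /tn "SharePointFarmBackup" /tr "C:\scripts\farmbackup.bat" /sc daily /st 02:00

    This creates a task named SharePointFarmBackup that runs the batch file every day at 2 AM; run it from an elevated prompt, and configure the task to run under an account with farm administrator rights.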

    Read the article

  • 7 Web Design Tutorials from PSD to HTML/CSS

    - by Sushaantu
    Some time back, when I was looking for tutorials on creating a website from scratch – i.e. the process of designing the PSD, slicing it, and turning it into CSS/XHTML – not many quality results appeared. But that was almost a year back, and a lot of water has flowed down the river Thames since then. In this list I will give you links to some wonderful tutorials that teach you, step by step, how to design a website. These tutorials are ideal for someone who is learning web design and has a grasp of basic CSS and XHTML and a little Photoshop.

    1. How to Design and Code Web 2.0 Style Web Design
    2. Design a website from PSD to HTML
    3. Designing and Coding a Grunge Web Design from Scratch
    4. Creating a CSS layout from scratch
    5. Build a Sleek Portfolio Site from Scratch
    6. Designing and Coding a web design from scratch
    7. Design and Code a Dark and Sleek Web Design

    Read the article

  • Google Webmaster Tools search queries position

    - by user1592845
    In my website's account in Google Webmaster Tools, some search queries show an average position of 1.0. I understand this to mean that my page should be displayed as the first result. But when I search for such a query, I cannot find my website's page in the results at all! In some cases I navigate to the third or fourth page of results and still cannot find it! What factors can make my website lose its average position for a search query? And when does Google Webmaster Tools update these values?

    Read the article

  • Security issue about making my code public in GitHub

    - by John Doe
    I'm developing a big community/forum website and I'd like to upload my code to GitHub to have at least some sort of version control over it (because I have nothing other than a .rar file as a backup, not even SVN), to let others contribute to the project, and also perhaps to let my potential future employers see some of my code as a sort of curriculum vitae. But what I'm wondering now, and I'm surprised I haven't seen anyone mention it before, is the security aspect of it. Isn't publishing the code of a website a HUGE security hole? It's like handing a potential hacker, or anyone who cares to look, every possible exploit – even considering that the critical files aren't uploaded (database passwords, authentication scripts, etc.). Of course there are millions of projects uploaded to GitHub and no one will find mine just 'by chance'. But if they looked for it, it would indeed be there. Bottom line: my problem is not about copyright or licenses, but about others finding exploits in my website. Am I missing something here?
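
    One common mitigation, whatever the answer on exploits, is to make sure secrets never enter the repository at all: keep credentials in an untracked config file (or environment variables) and commit only a template. A hypothetical PHP sketch of that pattern – the file names and keys are illustrative, not from the post:

    ; config.sample.ini -- committed to the repo as a template
    [database]
    host = "localhost"
    user = "CHANGE_ME"
    password = "CHANGE_ME"

    <?php
    // bootstrap.php -- the real config.ini is listed in .gitignore,
    // so the published history never contains credentials.
    $config = parse_ini_file(__DIR__ . '/config.ini', true);
    if ($config === false) {
        die('Missing config.ini - copy config.sample.ini and fill in your values.');
    }

    $db = new mysqli(
        $config['database']['host'],
        $config['database']['user'],
        $config['database']['password']
    );
    ?>

    With this layout, .gitignore contains a single line (config.ini), and readers of the public repo can still see how configuration is wired up without ever seeing the real values.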

    Read the article

  • advertising servers / advert delivery solutions for C#/Asp.Net

    - by Karl Cassar
    We have a website in which we want to show adverts. However, these are custom adverts uploaded by the webmaster – not Google's adverts, or any adverts the network chooses. Ideally, there would be both options. We were considering developing our own advert-management system but, looking at the big picture, it might be better to consider other alternatives. The website is currently developed in C# / ASP.Net (Web Forms). Are there any recommendations for open-source advert delivery systems and/or externally hosted advert delivery networks? Personally I've used Google's DFP; however, sometimes it is not so easy to get a Google AdSense account approved, especially while developing a new website that has not yet been launched. Not sure if this is the best place to ask this kind of question!

    Read the article

  • Legal responsibility for embedding code

    - by Tom Gullen
    On our website we have an HTML5 arcade. Each game has an "embed this game on your website" copy + paste code box. We've made the approval process for games as strict and safe as possible, and we don't actually think it is possible to get any malicious code into the games. However, we are aware that there's a bunch of people out there smarter than us who might be able to exploit it. For webmasters wanting to copy + paste our games onto their websites, we want to warn them that they are doing so at their own risk – but could we be held responsible if, for instance, a malicious game were hosted on an important website and it stole their users' credentials and caused them damage? I'm wondering if having an HTML comment in the copy + paste code saying "Use at your own risk" is sufficient.

    Read the article

  • Logic to create a common Servlet 3 login

    - by user3696143
    I am using Servlet 3 login to authenticate users on a website. I have these login options:

    1. Normal website login (fill in the signup form)
    2. Facebook login (with the Facebook id)
    3. Twitter login (via Twitter)

    I already authenticate users with the code below:

    HttpServletRequest request = (HttpServletRequest) FacesContext.getCurrentInstance().getExternalContext().getRequest();
    request.login(username, password);

    This works fine for the normal website login, where the user's e-mail id and password are stored in the DB. Now I have modified the table and added more columns to save the Facebook id in the same user table; for Facebook login the FacebookId also works as the password. I will do the same for Twitter. But I want the same Servlet 3 mechanism to authenticate the user. How can I achieve this? I have also added a context.xml file inside the META-INF folder:

    <Realm localDataSource="true" debug="99"
           className="org.apache.catalina.realm.JDBCRealm"
           connectionName="user" connectionPassword="password"
           connectionURL="jdbc:mysql://localhost:3306/ccc"
           digest="md5" driverName="com.mysql.jdbc.Driver"
           roleNameCol="role_name" userCredCol="password"
           userNameCol="email_id" userRoleTable="users_list"
           userTable="user_list_view" />

    Also, is it possible to check which query is fired by the realm entry?

    Read the article

  • Where's Randall Hyde?

    - by user1124893
    This probably doesn't belong here, but I couldn't think of any other StackExchange site that would fit it. Quick question: whatever happened to Randall Hyde, author of The Art of Assembly, HLA, and other works? I ask because I was just exploring some of the content on his website and a lot of it is now gone. His website was hosted on Apple's MobileMe, and as of the writing of this question, Apple closed off all MobileMe content a few days ago. Correct me if I'm wrong, but didn't Apple warn of this a year in advance? If so, then where's Randall Hyde? Come to think of it, all of the content on his website that I have seen is several years old. A lot of it is rather useful but unfinished.

    Read the article

  • Using subdomains or directories for main categories?

    - by Matthieu
    I have a website which references places to travel to around the world. Those places are (of course) grouped by country. Here is an example of an actual URL on my website: http://awespot.org/country/105/iceland

    I am wondering if it would be better, in terms of SEO, to separate the countries into subdomains: http://iceland.awespot.org/

    I know subdomains are considered by Google as different websites, so I am weighing the two options: separating would mean splitting the PageRank and the benefit of inbound links across the subdomains, but it would also enable me to create a "web" of related websites (all about travel) that link to and benefit each other. I am only asking about SEO here; I know there are other questions raised by this possibility (user experience, administration... even password completion by web browsers).

    Read the article

  • Web developer has become uncooperative, what should we do to rescue our site? [closed]

    - by TOM
    What can an individual or a company do if the web designer who designed their website becomes completely uncooperative? In our case: He refuses to meet with us for discussions. He refuses to give us training on the effective use and management of the website he designed for us (and has been paid in full for), despite this training being part of the original contract. Last year he disappeared for a number of months, refused to answer emails or phone calls, and was totally unavailable to help us. He refuses to give us the details of the hosting of our website. We have totally lost faith in this arrogant and unreasonable guy and would like to break off all relations with him, but it appears he's got us over a barrel.

    Read the article

  • What tools to use for efficient link building?

    - by Evgeny
    As most SEO experts keep saying, it is not just the content that you have – a hefty amount of quality incoming links to your content is also important; these are the two ways to get to the top of the search results. The question is: where do I find the incoming links? One way I know of is Google Blog Search: it can be used to find blogs with information related to your content, and some of them allow comments. The comments usually consist of your name, e-mail and website. If you put your keyword instead of your name, the keyword turns into a link to your website. Unfortunately most blogs put rel=nofollow on such links, but some blogs don't. What other ways are there to find quality pages on which to place keyword links back to your website? A quality link usually means:

    - it is located on a page with relevant content
    - it does not have rel=nofollow in the <a> tag
    - it has a relevant keyword as the anchor text, as in <a href="website">keyword</a>
    - the page with the link has high PageRank (3+) and TrustRank

    Read the article

  • PHP templating with CodeIgniter

    - by JaPerk14
    I am currently developing a website application in CodeIgniter, and I'd like to do something in PHP / CodeIgniter where I can make a common template for separate sections of the website. I was thinking that I would keep the header / footer in separate PHP files and include them separately. The thing I'm not sure about is the content beneath the header and above the footer. This website application will contain a lot of different pages, so I'm having a hard time figuring out the best way to do this.
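
    A common CodeIgniter pattern for this is a single "template" view that loads the header, the page-specific content view, and the footer; each controller then passes only the name of its content view. This is a minimal sketch under that assumption – the view and variable names (template, content_view, etc.) are made up for illustration:

    <?php
    // application/views/template.php
    // One master view: header and footer are shared, the middle part varies.
    $this->load->view('common/header');
    $this->load->view($content_view);   // e.g. 'pages/about'
    $this->load->view('common/footer');
    ?>

    <?php
    // In any controller method:
    class Pages extends CI_Controller {
        public function about() {
            $data['content_view'] = 'pages/about';
            // Data for the content view can be passed along in $data too.
            $this->load->view('template', $data);
        }
    }
    ?>

    This keeps every page wrapped in the same header/footer without repeated load calls, and changing the layout later only means editing template.php.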

    Read the article

  • Sharing banner on 3rd party websites, concerned about limited resources

    - by Omne
    I've made a banner for my website and I'm planning to ask my followers to share it on their websites to help improve my rank. My website is hosted on GAE, the banners are less than 5 KB each, and I must say that I don't want to pay for extra bandwidth. I've read the Google App Engine Quotas page, but honestly I don't understand any of it. Would you please tell me which table/data on that page should concern me? Also, do you think it's wise to host such banners, which are going to end up on 3rd party websites, on GAE? Or am I safer using a free online service like Google Picasa?

    Read the article

  • How to pay for servers without ads?

    - by Matthieu
    Hi everyone! I have a website whose goal is to provide a free educational service (like Wikipedia does, not on the same scale obviously). I wonder how I will continue to pay for the servers when I reach high traffic. I don't want to use any ads or sponsorships, because that is not the goal of the website (just like Wikipedia, once again). Does "please donate" work? If not, is there any alternative to a private/premium (non-free) part of the website? Thanks

    Read the article

  • Why can't GoogleBot forget a 404 File Does Not Exist?

    - by Sam
    Hi folks, for two months now Googlebot has been trying to fetch a file which does not exist anymore. This is just one example out of many: I had renamed the file to a better name and removed the old file. Now, why does Google insist on requesting a file which it has already seen does not exist, for months on end? Doesn't it just give up and get on with being a happy bot? My error log is filled with these repeating lines, all trying to get that one file, although Google has known for a long time that it's not there:

    [Sat Mar 05 01:55:41 2011] [error] [client 66.249.66.177] File does not exist: /var/www/vhosts/website.org/httpdocs/extraNeus.php
    [Sat Mar 05 01:58:20 2011] [error] [client 66.249.66.177] File does not exist: /var/www/vhosts/website.org/httpdocs/extraNeus.php
    [Sat Mar 05 02:03:57 2011] [error] [client 66.249.66.177] File does not exist: /var/www/vhosts/website.org/httpdocs/extraNeus.php

    on and on... and on...

    What should I do in these situations? Are there automatic redirection rules that can handle this via a 301 to the home page?
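
    Since the paths in the log are served by Apache, one way to act on the poster's own 301 question is a rule in the site's .htaccess (or vhost config) using mod_alias. The new target filename below is a placeholder – the post never names the renamed file:

    # Permanently redirect the old URL to wherever the content lives now
    # (replace /new-name.php with the actual new filename):
    Redirect 301 /extraNeus.php /new-name.php

    # Alternatively, tell crawlers the file is gone for good (HTTP 410),
    # which is the stronger hint for search engines to drop the URL:
    Redirect gone /extraNeus.php

    Either response should eventually stop the repeated fetches; a plain 404 is treated by crawlers as possibly temporary, which is why Googlebot keeps retrying.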

    Read the article

  • Why is my site not showing in Google?

    - by nishant
    I am very tired of my website's ranking in Google. I am working hard on it but getting nowhere with my site in Google. I am the webmaster of www.panbeli.in, a matrimony website in INDIA. I have been trying to improve its visibility in Google for the last 5 months but am not getting any positive result. Other search engines like Yahoo and Bing give good results, but Google shows no result for my site in the top 20 pages of results. My website is www.panbeli.in and my keywords are:

    bari samaj
    bari matrimony
    bari community
    bari shadi
    panbeli

    Please help me if you can, because I am very frustrated about it. My domain age is 4 years. When I type link:panbeli.in in Google search, no pages from my site appear. What does that mean? Is my site not indexed in Google?

    Read the article

  • How to evaluate SEO/prominence improvement [on hold]

    - by Rober
    I am about to start SEO work on a website, and before starting I would like to "take a snapshot" of its present status, so that in a few months I can compare it with the new situation and evaluate my work and the real improvement. I don't mean whether the website is well implemented or not, but how well it is seen by Google and the others – what prominence it has. I am taking some variables from Google Analytics (average daily visits...), from Google Webmaster Tools (search traffic and average position...), and some other indicators, like automatic SEO audit figures (estimated website worth, real PageRank...). What would you look at before starting an SEO improvement?

    Read the article

  • Issues with web hosting at home

    - by hari
    I want to host a small personal website at home. One basic problem I am hitting: from inside my home network, I cannot access my domain name. I have to use the local IP (something like 192.168.1.4) to access the website; this IP is the desktop which is hosting the website. Because of this, I also have issues setting up a simple WordPress blog on it. How do I get past this issue?

    Edit: when I try to access www.example.com (my domain) from within my home network, I get redirected to my router login.

    PS: 1) I am using the DynDNS service to map my non-static IP to my domain name. 2) My port forwarding works fine.
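
    What is described here is the classic NAT loopback (hairpinning) problem: the router answers for the public IP when it is hit from inside the LAN. If the router has no NAT-loopback option, a common workaround is a static hosts-file entry on the LAN machines so the name resolves straight to the local server. The IP and domain below reuse the ones from the question:

    # /etc/hosts on Linux/macOS, or
    # C:\Windows\System32\drivers\etc\hosts on Windows.
    # Point the domain at the web server's LAN address for machines inside the network:
    192.168.1.4    example.com www.example.com

    Outside visitors still resolve the domain via DynDNS to the public IP, so this only changes behavior inside the home network (and it also unblocks WordPress setups that store the site URL as the domain name).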

    Read the article

  • Does Webmaster Tools list traffic from ads as inbound links?

    - by Mohamad
    In Webmaster Tools, under the inbound links section, do ads get counted as inbound links? I am doing a review of inbound links on a website and found that most of them come from meaningless blogs and spam websites. Before I accuse anyone of not doing their job properly, I would like to know something: is it possible that those inbound links were generated when an ad for the website appeared on the spam website? An SEO firm was paid handsomely to generate inbound links, and I am afraid all they did was submit material to spam blogs and websites.

    Read the article

  • Drive Online Engagement with Intuitive Portals and Websites

    - by kellsey.ruppel
    As more and more business is being conducted via online channels, engaging users and making them more productive and efficient through these online channels is becoming critical. These users could be customers, partners or employees, and while the respective channels through which they interact might differ, these users increasingly interact with your business through the Web, mobile devices, and now various social mediums. Businesses need a user engagement strategy and solution that allows them to deliver targeted and personalized content and applications to users through the various online mediums and touch points.

    The customer experience today is made up of an ongoing set of interactions with organizations across many channels, online and offline. The direct channel (including sales reps, email and mail) is an important point of contact, as is the contact center. Contact centers rely on the phone as a means of interacting with customers and, more now than ever, the Web as well. However, the online organization is often managed separately from the contact center organization within a business. In-store is an important channel for retailers, offering point-of-service for human interactions, and kiosks which enable self-service. Kiosks are a Web-enabled touch point, but in-store kiosks are often managed by the head of retail operations rather than the online organization. And of course the online channel – customer interactions with an organization via digital means on the website, mobile websites, and social networking sites – has risen to paramount importance in the customer experience in recent years.

    Historically, all of these channels have been managed separately. The result of all of this fragmentation is that the customer touch points with an organization are siloed. Their interactions online are not known and respected in their dealings in-store. Their calls to the contact center are not taken as input into what the website offers them when they arrive. Think of how many times you've fallen victim to this: your experience with the company call center differs from the experience in-store; your experience with the company website on your desktop computer differs from your experience on your iPad. I think you get the point.

    But the customer isn't the only one we need to look at here, as employees and the IT organization have challenges as well when it comes to online engagement. There are many common tools and technologies that organizations have been using to try to engage users, whether customers, employees or partners. Some have adopted different blog and wiki technologies (some hosted, some open source, sometimes embedded in platforms), things like tagging, file sharing and content management, or composite applications for self-service applications and activity streams. Basically, there are so many different tools and technologies that each address different aspects of user engagement. One of the challenges with this is that implementing, for example, a file sharing and basic collaboration solution may meet the needs of the business user for one aspect of user engagement, but it may not be the best solution for engaging with customers and partners, or it may not fit with IT standards such as integrating with single sign-on tools or the corporate website.

    Often, the scenario is that businesses have to acquire multiple pieces and parts, as well as build custom applications, to meet their needs – leaving customers and partners with a more fragmented way of interacting with the company. Every organization has some sort of enterprise balancing act between the needs of the business user and the needs and restrictions enforced by enterprise IT groups.

    As we've been discussing, we all know that the expectations for online engagement have changed since the days of the static, one-size-fits-all website. With these changes have come some very difficult organizational challenges as well. Today, as a business user, you want to engage with your customers, and your customers expect you to know who they are. They expect you to recall the details they've provided to you on your website, to your CSRs and to your sales people. They expect you to remember their purchases, their preferences and their problems. And they expect you to know who they are, equally well, across channels, including your web presence. This creates a host of challenges for today's business users.

    Delivering targeted, relevant content online is now essential for converting prospects into customers and for engendering long-term loyalty. Business users need the ability to leverage customer data from different sources to fuel their segmentation and targeting strategies and to easily set up, manage and optimize online campaigns. Also critical, they need the ability to accomplish these things on the fly, at the speed of the marketplace, while making iterative improvements.

    These changing expectations put a host of demands on the IT organization as well. The web presence must be able to scale to support the delivery of personalized and targeted content to thousands of site visitors without sacrificing performance. And integration between systems becomes more important as well, as organizations strive to obtain one view of the customer culled from WCM data, CRM data and more.

    So then, how do you solve these challenges and meet the growing demands of your users? You need a solution that:

    - Unifies every customer interaction across all channels
    - Personalizes the products and content that interest the customer, tailored to the device
    - Delivers targeted promotions to the right customer
    - Engages and improves employee productivity
    - Provides self-service access to applications
    - Includes embedded, in-context social capabilities

    So how then do you achieve this level of online engagement and complete customer experience, and engage your employees? The answer: Oracle WebCenter. If you want to learn how to get there, we encourage you to attend this Thursday's webcast, Drive Online Engagement with Intuitive Portals and Websites, where we'll talk about how you can transform your portal experience and optimize online engagement – making your portals more interactive and more engaging across multiple channels. Register today!

    Read the article
