Search Results

Search found 53516 results on 2141 pages for 'web analytics tools'.


  • Tor and Google Analytics - how to track?

    - by Jeremy French
    I make a lot of use of Google Analytics. Google has reasonable location tracking, so I can tell roughly where users come from. I know it is not 100% accurate, but it gives an idea. In the wake of PRISM it is likely that more people will use networks such as Tor for anonymous browsing. I have no problem with this (people can wear tin-foil hats while browsing my site for all I care), but it will lead to more erroneous stats. Is there any way to flag traffic as coming from Tor, so that I can exclude it from location reports and get an idea of the percentage of traffic that uses it? Has anyone actually tried this?
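    One approach that has worked for others: the Tor Project publishes its exit-node IPs, so you can check each visitor's address server-side and tag matching sessions before the data reaches GA. Below is a minimal Python sketch of that check; the exit-list URL and the refresh strategy are assumptions to verify against the Tor Project's current documentation, and how you tag the session (for example via a classic-GA custom variable) depends on your tracking setup.

        # Sketch: flag Tor visitors server-side so their sessions can be
        # filtered out of location reports. The exit-list endpoint below is
        # an assumption; check the Tor Project docs for the current URL.
        import urllib.request

        EXIT_LIST_URL = "https://check.torproject.org/torbulkexitlist"

        def load_tor_exit_nodes():
            """Download the published Tor exit-node IPs into a set."""
            with urllib.request.urlopen(EXIT_LIST_URL) as resp:
                return {line.strip() for line in resp.read().decode().splitlines()
                        if line.strip() and not line.startswith("#")}

        TOR_EXITS = load_tor_exit_nodes()  # refresh periodically, e.g. hourly

        def is_tor_visitor(remote_ip):
            # True if the requesting IP is a known Tor exit node
            return remote_ip in TOR_EXITS

    If is_tor_visitor() returns true, you could emit a custom variable or dimension with the page view and then segment or exclude that traffic in your location reports; the naming and reporting side is left to your GA configuration.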

    Read the article

  • Application Composer: Exposing Your Customizations in BI Analytics and Reporting

    - by Richard Bingham
    Introduction

    This article explains in simple terms how to ensure the customizations and extensions you have made to your Fusion Applications are available for use in reporting and analytics. It also includes four embedded demo videos from our YouTube channel (if they don't appear, check the browser address bar for a blocking shield icon). If you are new to Business Intelligence, consider first reviewing our getting-started article, and you can read more about the topic of custom subject areas in the documentation book Extending Sales. There are essentially four sections to this post. First we look at how custom fields added to standard objects are made available for reporting. Secondly we look at creating custom subject areas on the standard objects. Next we consider reporting on custom objects, starting with simple standalone objects, then child custom objects, and finally custom objects with relationships. Finally, this article reviews how flexfields are exposed for reporting. Whilst this article applies to both Cloud/SaaS and on-premises deployments, if you are an on-premises developer you can also use the BI Administration Tool to customize your BI metadata repository (the RPD) and create new subject areas. Whilst this is not covered here, you can read more in Chapter 8 of the Extensibility Guide for Developers.

    Custom Fields on Standard Objects

    If you add a custom field to a standard object then it's likely you'll want to include it in your reports. This is very simple, since all new fields are instantly available in the "[objectName] Extension" folder in existing subject areas. The following two-minute video demonstrates this.

    Custom Subject Areas for Standard Objects

    You can create your own subject areas for use in analytics and reporting via Application Composer. An example use case could be to simplify the seeded subject areas, since they sometimes contain complex data fields and internal values that could confuse business users. One thing to note is that you cannot create subject areas in a sandbox, as that is not supported by BI, so once your custom object is tested and complete you'll need to publish the sandbox before moving forward. The subject area creation process is essentially two-fold: once the request is submitted, the ADF artifacts are generated; then the related metadata is sent to the BI presentation server APIs to make the updates there. Note that this second step may take up to ten minutes to complete. Once finished, the status of the custom subject area request should show as 'OK' and it is then ready for use. Within the wizard-like creation steps there are three concepts worth highlighting:

    Date Flattening - permits the roll-up of reports at various date levels, such as data by week, month, quarter, or year. You simply check the box to enable it for that date field.

    Measures - your own functions that you can build into the custom subject area. They are related to the field data type and include min/max for dates, and sum(), avg(), and count() for numeric fields.

    Implicit Facts - used to make the BI metadata join between your object fields and the calculated measure fields. The advice is to choose the most frequently used measure to ensure consistency.

    This video shows a simple example, where a simplified subject area is created for the customer 'Contact' standard object, picking just a few fields upon which users can then create reports.
    Custom Objects

    Custom subject areas support three types of custom objects. First is a simple standalone custom object, for which the same process mentioned above applies. The next is a custom child object created on a standard-object parent, and finally a custom object that is related to a parent object, usually through a dynamic choice list. Whilst the steps for the last two are mostly the same, there are differences in the way you choose the objects and their fields. This is illustrated in the videos below. The first video shows the process for creating a custom subject area for a simple standalone custom object. The second video demonstrates how to create custom subject areas for custom objects of parent:child type, as well as those with dynamic-choice-list relationships.

    Flexfields

    Dynamic and extensible flexfields satisfy a similar requirement to custom fields (for Application Composer), with flexfields common across the Fusion Financials, Supply Chain and Procurement, and HCM applications. The basic principle is that when you enable and configure your flexfields, the edit page under each segment region (for both global and context segments) has a BI Enabled check box. Once this is checked and you've completed your configuration, you run the Scheduled Process job named 'Import Oracle Fusion Data Extensions for Transactional Business Intelligence' to generate and migrate the related BI artifacts and data. This applies to dynamic, key, and extensible flexfields. Of course there is more to consider in terms of how you wish your flexfields to be implemented and exposed in your reports; details are given in Chapter 4 of the Extending Applications guide.

    Read the article

  • Curious About Oracle's BI and Analytics Strategy?

    - by Bob Zurek
    Normally we use this blog space for discussing our business intelligence and analytics efforts, along with our views and perspective on this very fast-growing marketplace. However, I can't resist mentioning that we have a great webcast coming up next week, so please do join Oracle's Mark Hurd and Balaji Yelamanchili as they unveil the latest advances in Oracle's strategy for placing analytics into the hands of every one of your decision makers, so that they can see more, think smarter, and act faster. Register now at http://bit.ly/HpAOJk for the Webcast and Live Chat: Wednesday, April 4, 2012 at 9 a.m. PT, 12 p.m. ET, 10 a.m. GMT. You don't want to miss this event. Thank you very much.

    Read the article

  • Introduction to Oracle's Primavera P6 Analytics

    Primavera P6 Analytics is a new product offering from Oracle that uses the Oracle BI product, Oracle Business Intelligence Suite Enterprise Edition Plus (Oracle BI EE Plus), to provide root-cause analysis, manage-by-exception capabilities, and early-warning problem indicators for your P6 Enterprise Project Portfolio Management projects and portfolios. Out-of-the-box reports and dashboards provide drill-down and drill-through insight to action, letting you jump directly into the problem areas in Primavera P6 Enterprise Project Portfolio Management to course-correct or implement best practices based on successful project implementations. A complete view into all of your projects' histories provides the trends and analysis needed to identify the best course of action to take next.

    Read the article

  • Latest Business Analytics Support News has been released

    - by paul.a
    The latest edition of Business Analytics Support News is now available. You can read our quarterly newsletter, Volume 5, in Doc ID 1347131.1. Featured topics include:

    Details on Smart View 64-bit support
    Patch Set Update information for various products
    Social Media for EPM
    Documentation, plus more

    Did you miss any of the newsletters and want to find archived copies? Volumes one to four are all listed in the News Index, Doc ID 1347159.1. To ensure you don't miss out on future releases and recent changes, you can subscribe to the Product News. More details about the subscription are outlined in the Volume 5 "Subscribe to Product Support News" section.

    Read the article

  • Importing Data from Google Analytics

    - by Adam Tannon
    I am planning on building a web app with many different public-facing HTTP servers, each of which will have Google Analytics (GA) installed. I'd like to create a "dashboard" app that consolidates the GA data onto one screen. I've been perusing the documentation for this so-called GA API, but I can't tell what its end result is: does the GA API allow me to do exactly what I am looking for? Or does it do something entirely different (like allow me to share my data with Google+ or something else weird)? Since an API can be used to CRUD any kind of data, I guess I'm asking which way the GA API goes: is it for querying (reading) data from one or more server instances, or is it for modifying data on those servers or somewhere else? Thanks in advance!
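    For what it's worth, the GA API is the first of your two options: it is primarily a read/query API over the data Google has already collected for each profile (view). Here is a minimal sketch of pulling one metric from several profiles into one place, the raw material for a consolidated dashboard. It assumes the Core Reporting API via the google-api-python-client package with OAuth credentials already set up; the view IDs are placeholders.

        # Sketch: query the same metric from several GA profiles (views),
        # one per public-facing server, for a consolidated dashboard.
        # Assumes google-api-python-client and valid OAuth credentials.
        from googleapiclient.discovery import build

        VIEW_IDS = ["ga:11111111", "ga:22222222"]  # placeholders, one per server

        def fetch_sessions(credentials):
            analytics = build("analytics", "v3", credentials=credentials)
            results = {}
            for view_id in VIEW_IDS:
                data = analytics.data().ga().get(
                    ids=view_id,
                    start_date="7daysAgo",
                    end_date="today",
                    metrics="ga:sessions",
                ).execute()
                results[view_id] = data.get("totalsForAllResults", {})
            return results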

    Read the article

  • Exalytics Webcast - Extreme Analytics Without Limits

    - by Rob Reynolds
    Event Date: October 18, 2012
    Event Time: 11 a.m. PT / 2 p.m. ET

    If your organization is like most, you grapple with an ongoing struggle to obtain timely and relevant information from your enterprise systems. So while you may have the data needed to answer key questions, the volume, complexity, and dispersal of that data makes getting those answers tough. Attend this Webcast to learn how the combination of the Oracle Exalytics In-Memory Machine and Oracle's market-leading analytic applications enables you to go beyond the traditional boundaries of data analysis and get the insight you need from massive volumes of data, all at the speed of thought. See how you can benefit from running your analytic applications on Oracle Exalytics to:

    Lower TCO
    Improve operational decision making and enhance competitive advantage
    Deliver speed-of-thought analysis, anytime, anywhere

    Register today. Learn how Oracle Business Analytics can move your business ahead. Register Here

    Read the article

  • Google Analytics

    - by DavidMadden
    My first post. Working with Google Analytics (GA). What an incredible tool this is for those who want to know about their site traffic. GA allows the user to drill down to the screen size of any mobile device that came in contact with the site. The user can even see the regional demographics of visitors and the types of browsers and languages being used. This is a great tool for determining your target audience and what direction of growth you may need to take. GA has Real-Time currently in beta, but it already allows the user to see some information. I can already detect that I am viewing my site from Louisville, KY and which page of my site is being accessed. I highly recommend GA for the sheer plethora of data available.

    Read the article

  • Google Analytics - New & Old Domain Filtering

    - by DotNetStudent
    I have two domains associated with the same hosting account. Domain A was the one I purchased when I first set up my website, but I didn't like it all that much and it was getting low ranks, so I bought domain B and 301-redirected links from domain A to domain B. Now I would like to know if there is any way to filter page views in Google Analytics based on the domain used to access the website, so that I can tell whether it is worth keeping the old domain associated with the account. Is there an easy way to do this?
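    One detail that may help: GA records the hostname of every page view by default, so if both domains actually serve the tracked pages you can segment on the ga:hostname dimension without changing your tracking code. A hedged sketch, assuming an authorized google-api-python-client Core Reporting (v3) service object; the view ID is a placeholder.

        # Sketch: break page views down by the hostname used to reach the
        # site, to see how much traffic still arrives via the old domain.
        # Assumes an authorized "analytics" v3 service object as built
        # elsewhere on this page; the view ID is a placeholder.
        def pageviews_by_hostname(analytics, view_id="ga:11111111"):
            return analytics.data().ga().get(
                ids=view_id,
                start_date="30daysAgo",
                end_date="today",
                metrics="ga:pageviews",
                dimensions="ga:hostname",  # one row per domain
            ).execute().get("rows", [])

    One caveat: if domain A now only issues a 301 and never serves the tracking code itself, its visits will be recorded under domain B's hostname. A common workaround is to have the redirect append a tagging parameter (for example a utm_source value) so redirected visits remain identifiable.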

    Read the article

  • Telerik is the First UI Components Vendor to Support Microsoft Silverlight Analytics Framework

    Telerik Aligns RadControls for Microsoft Silverlight with Latest Microsoft Innovations. Waltham, MA, March 16, 2010. Telerik, a leading vendor of development tools and user interface components for .NET, announced today that its RadControls for Microsoft Silverlight 3 provides support for the Microsoft Silverlight Analytics Framework Beta, allowing developers to benefit from the new capabilities the framework offers. The new framework, announced yesterday at Microsoft Corp.'s MIX10 conference, allows...

    Read the article

  • Configure Forms based authentication in SharePoint 2010

    - by sreejukg
    Configuring forms authentication is a straightforward task in SharePoint. Most public-facing websites built on SharePoint require forms-based authentication. Recently, one of the WCM implementations I worked on required a registration system: any internet user can register with the site, and the site offers them membership-specific functionality once they log in. Since registration is open to all, I didn't want to store those users in Active Directory, so I decided to use forms-based authentication for them. This is a typical scenario for forms authentication in a SharePoint implementation.

    To implement forms authentication you require the following: a data store where you store the users (technically this can be Active Directory, a SQL Server database, LDAP, etc.). Forms authentication will redirect the user to the login page if the request is not authenticated, and on the login page there will be controls that validate the user inputs against the configured data store.

    In this article, I am going to use a SQL Server database with the ASP.NET membership APIs to configure forms-based authentication in SharePoint 2010. This article assumes that you have a SQL membership database available. I have already configured the membership and roles database using the aspnet_regsql command. If you want to know how to configure a membership database using aspnet_regsql, read this blog post: http://weblogs.asp.net/sreejukg/archive/2011/06/16/usage-of-aspnet-regsql-exe-in-asp-net-4.aspx

    The snapshot of the database after implementing membership and role manager is as follows. I have used the database name "aspnetdb_claim". Make sure you have created the database and that it contains the tables and stored procedures for membership.

    Create a web application with claims-based authentication. This article assumes you have already created a web application using claims-based authentication. If you want to enable forms-based authentication in SharePoint 2010, you must enable claims-based authentication. Read this post for creating a web application using claims-based authentication: http://weblogs.asp.net/sreejukg/archive/2011/06/15/create-a-web-application-in-sharepoint-2010-using-claims-based-authentication.aspx

    Make sure you have selected "enable forms authentication" and then selected the membership provider and role manager names. To confirm the configuration, navigate to the Central Administration website, go to the Web Applications page, select the web application and click the Authentication Providers icon; you will see the authentication providers for the current web application. Go to the section "Claims authentication types" and make sure you have enabled forms-based authentication. As shown in the snapshot, I have named the membership provider SPFormAuthMembership and the role manager SPFormAuthRoleManager. You can choose your own names as needed.

    Modify the configuration files (web.config) to enable forms authentication. There are three applications that need to be configured to support forms authentication:

    Central Administration - if you want to assign permissions to the web application using credentials from forms authentication, you need to update the Central Administration configuration. If you do not want to access forms authentication credentials from Central Administration, skip this step.
    STS service application - the Security Token Service is the service application that issues security tokens when users log in. You need to modify the configuration of the STS application to make sure users are able to log in. To find the STS application: go to IIS Manager, expand the Sites node to see SharePoint Web Services, expand SharePoint Web Services to see SecurityTokenServiceApplication, then right-click SecurityTokenServiceApplication and click Explore to open the corresponding file system location. By default, the path for the STS is C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\WebServices\SecurityToken. You need to modify the configuration file in that location.

    The web application to be enabled with forms authentication - you need to modify the configuration of your web application to make sure it identifies users from forms authentication.

    Based on the above, I am going to modify the web configuration. At the end of each step I have mentioned the expected output. I recommend you go step by step and, after each step, make sure the configuration changes are working as expected. If you do everything all together and only test your application at the end, you may have difficulty troubleshooting configuration errors.

    Modifications for Central Administration web.config

    Open the web.config for Central Administration in a text editor. I always prefer Visual Studio for editing web.config. In most cases, the path of the web.config for the Central Administration website is C:\inetpub\wwwroot\wss\VirtualDirectories\<port number>. Make sure you keep a backup copy of the web.config before editing it. Let me summarize what we are going to do with the Central Administration web.config: first I am going to add a connection string that points to the forms authentication database created in the previous steps; then I need to add a membership provider and a role manager with the corresponding connection string; then I need to update the PeoplePickerWildcards section to make sure the users appear in search results.

    By default there is no connection string in the web.config of Central Administration. Add a connection string just after the configSections element. The following is the connection string I have used throughout this article:

        <add name="FormAuthConnString" connectionString="Initial Catalog=yourdatabasename;data source=databaseservername;Integrated Security=SSPI;" />

    Once you have added the connection string, the web.config will look similar to the snapshot. Now add the membership provider. In the web.config for Central Administration there will be a <membership> tag; search for it. You will find membership and role manager under the <system.web> element. Under the membership providers section, add the following:

        <add name="SPFormAuthMembership" type="System.Web.Security.SqlMembershipProvider, System.Web, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" applicationName="FormAuthApplication" connectionStringName="FormAuthConnString" />

    After adding the membership element, see the snapshot of the web.config. Now you need to add the role manager element. Inside the providers element under roleManager, add the following:
<add name="SPFormAuthRoleManager" type="System.Web.Security.SqlRoleProvider, System.Web, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" applicationName="FormAuthApplication" connectionStringName="FormAuthConnString" /> After adding, your role manager will look similar to the following. As a last step, you need to update the people picker wildcard element in web.config, so that the users from your membership provider are available for browsing in Central Administration. Search for PeoplePickerWildcards in the web.config, add the following inside the <PeoplePickerWildcards> tag. <add key="SPFormAuthMembership" value="%" /> After adding this element, your web.config will look like After completing these steps, you can browse the users available in the SQL server database from central administration website. Go to the site collection administrator’s page from central administration. Select the site collection you have created for form authentication. Click on the people picker icon, choose Forms Auth and click on the search icon, you will see the users listed from the SQL server database. Once you complete these steps, make sure the users are available for browsing from central administration website. If you are unable to find the users, there must be some errors in the configuration, check windows event logs to find related errors and fix them. Change the web.config for STS application Open the web.config for STS application in text editor. By default, STS web.config does not have system.Web or connectionstrings section. Just after the System.Webserver element, add the following code. <connectionStrings> <add name="FormAuthConnString" connectionString="Initial Catalog=aspnetdb_claim;data source=sp2010_db;Integrated Security=SSPI;" /> </connectionStrings> <system.web> <roleManager enabled="true" cacheRolesInCookie="false" cookieName=".ASPXROLES" cookieTimeout="30" cookiePath="/" cookieRequireSSL="false" cookieSlidingExpiration="true" cookieProtection="All" createPersistentCookie="false" maxCachedResults="25"> <providers> <add name="SPFormAuthRoleManager" type="System.Web.Security.SqlRoleProvider, System.Web, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" applicationName="FormAuthApplication" connectionStringName="FormAuthConnString" /> </providers> </roleManager> <membership userIsOnlineTimeWindow="15" hashAlgorithmType=""> <providers> <add name="SPFormAuthMembership" type="System.Web.Security.SqlMembershipProvider, System.Web, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" applicationName="FormAuthApplication" connectionStringName="FormAuthConnString" /> </providers> </membership> </system.web> See the snapshot of the web.config after adding the required elements. After adding this, you should be able to login using the credentials from SQL server. Try assigning a user as primary/secondary administrator for your site collection from Central Administration and login to your site using form authentication. If you made everything correct, you should be able to login. This means you have successfully completed configuration of STS Configuration of Web Application for Form Authentication As a last step, you need to modify the web.config of the form authentication web application. Once you have done this, you should be able to grant permissions to users stored in the membership database. Open the Web.config of the web application you created for form authentication. 
    You can find the web.config for the application under C:\inetpub\wwwroot\wss\VirtualDirectories\<port number>. Basically you need to add the connection string, membership provider, and role manager, and update the people picker wildcard configuration. Add the connection string (the same one you added to the Central Administration web.config); see the screenshot after the connection string has been added. Search for <membership> in the web.config; you will find it inside the system.web element, with other providers already there. Add your forms authentication membership provider (similar to the one added to the Central Administration web.config) to the providers element under membership; see the snapshot of the membership configuration. Search for the <roleManager> element and add the new provider name under its providers section; see the snapshot of the web.config after the new provider has been added. Now configure the PeoplePickerWildcards setting in the web.config. As I mentioned earlier, this is to make sure you can locate users by entering part of their username. Add the same line as before under the <PeoplePickerWildcards> element; see the screenshot after the element has been added.

    Now you have completed all the setup for forms authentication. Navigate to the web application. From Site Actions -> Site Settings, go to People and Groups. Click New -> Add Users; the people picker dialog will pop up. Click the icon, select Forms Auth, enter a username in the search textbox, and click the search icon. See the screenshot of the search when I tried finding users. If it displays the user, you are done with the configuration. Users added to the forms authentication database will then be able to access the SharePoint portal as normal.

    Read the article

  • How to debug a web service written in PHP?

    - by nightcoder
    Hello, I've got a nice question here :) I need to debug my web service, which is written in PHP; its client is written in C#. After a couple of days of searching I realized this is not an easy task. At least it seems nobody knows the right solution. What is the problem, actually? We have two popular PHP debugging tools: PHP Debugger from NuSphere and the XDebug extension. The problem is that both are controlled via the URL query string or with cookies. For example, to enable debugging with PHP Debugger you need to add a ?DBGSESSID=xxx parameter to your URL or have a DBGSESSID cookie. But when your web service is called from the external client, the client doesn't have the cookie and doesn't add the DBGSESSID url parameter. So how can we debug in this situation? PS. I don't want to write to log files, inspect request and response headers/data, or anything like that. I want normal step-by-step debugging and breakpoints. Anyone?
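    One answer that usually works: the debugger trigger doesn't have to come from a browser, because any HTTP client can send the cookie or query parameter itself. The sketch below uses Python's requests library for brevity; the same idea applies to the C# client (add the cookie to the request's CookieContainer). XDEBUG_SESSION is XDebug's documented trigger cookie and DBGSESSID is NuSphere's; verify the exact value format against your versions.

        # Sketch: trigger a step-through debug session on the PHP web
        # service by sending the debugger's activation cookie with the
        # request. Endpoint and payload are placeholders; assumes the
        # `requests` package.
        import requests

        resp = requests.post(
            "http://example.com/service.php",    # placeholder endpoint
            data={"method": "doSomething"},      # placeholder payload
            cookies={
                "XDEBUG_SESSION": "my_ide_key",  # XDebug: value is your IDE key
                # "DBGSESSID": "1",              # NuSphere PhpED alternative
            },
        )
        print(resp.status_code)

    With the cookie present, the PHP-side debugger extension connects back to your IDE, and breakpoints behave as in normal step-by-step debugging.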

    Read the article

  • What makes good web form styling for business applications?

    - by ProfK
    Styling forms (form elements) is something that even Eric Meyer prefers to avoid. However, business forms are where styling is really at issue: 'contact us' forms are easy to style, but business forms put window real estate at a premium, with more 'document level' (e.g. invoice) fields plus 'detail level' (e.g. invoice line) fields. Factors I often find at play are:

    At minimum, at least two horizontally adjacent fieldsets are required.
    In applications vs. public web pages, fixed positioning is often better than a fluid layout.
    Quantity of content matters more than exaggerated readability.
    Users know the system, so cues etc. take a back seat.

    In light of factors like these, is there any available guidance for styling web-form-based applications? Are there any CSS or JavaScript frameworks that would make my quest to style these applications better than Visual Studio's still pathetic 'Auto-format'? (What drugs were those people on? I will never take them.)

    Read the article

  • Any web services with APIs for outsourcing webapp transactional email? [closed]

    - by Tauren
    My webapp needs to send customized messages to members, and I'm wondering if there is an inexpensive and easy-to-use web service that would meet my needs. The types of mail I will be sending include:

    New account activation email (sent ASAP)
    Daily status report of a user's account (sent anytime)
    Event reminders (sent at a specific time)

    Specifically, I would like the following features:

    A RESTful API to add an email message to the send queue
    A way to assign a priority to each message (account signup activations should be sent immediately, while a daily status report can be sent anytime each day)
    They manage the sending of mail and the processing of bounces
    Possibly, they manage opt-out/opt-in features
    They offer features such as DKIM, VERP, etc.
    If their service determines an address is undeliverable (via VERP or other means, a user unsubscribing, etc.), they make a RESTful call to my web service to notify me
    Nice if they had some reporting features: web bugs, link tracking, etc.

    What I am NOT looking for is an email marketing service that caters only to sending copies of the same mail to masses of recipients. I need to send unique, custom messages to individuals. I had a chat with MailChimp about using their services for these types of messages and they said their service does not support customized emails per recipient.

    Edit: I just discovered a service called JangoSMTP that appears to meet many of my requirements. It provides an API for sending mail and supports DKIM, bounce management, and even feedback loops. Unfortunately, their idea of inexpensive doesn't mesh with mine, as it would cost $180/mo to send a single daily message to my 1000 users.
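    To make the first wish-list item concrete, this is roughly the shape of call the question is asking providers for. Every endpoint, field, and header below is invented for illustration; no real provider's API is implied.

        # Sketch of a hypothetical "enqueue one custom message" REST call.
        # The host api.example-esp.com, the /v1/messages path, the JSON
        # fields, and the auth header are all made up for illustration.
        import json
        import urllib.request

        def enqueue_email(api_key, to_addr, subject, body, priority="normal"):
            payload = json.dumps({
                "to": to_addr,
                "subject": subject,
                "body": body,
                "priority": priority,  # e.g. "immediate" for activation mail
            }).encode()
            req = urllib.request.Request(
                "https://api.example-esp.com/v1/messages",  # hypothetical
                data=payload,
                headers={
                    "Content-Type": "application/json",
                    "Authorization": "Bearer " + api_key,
                },
            )
            with urllib.request.urlopen(req) as resp:
                return json.load(resp)  # hypothetically returns a queue ID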

    Read the article

  • How to install VMware tools for Ubuntu 11.04 hosted on VMware ESXi?

    - by Dmitri Toubelis
    I'm running VMware ESX 4.1 and I have a development VM that I recently upgraded from Ubuntu 10.04 to 11.04. Then I tried to re-install VMware Tools, and some of the modules gave me an error and would not compile. As a result I'm now having problems backing up this virtual machine, and I suspect VMware Tools is the reason. I installed the latest patches for the VMware host, which included an update to VMware Tools (v8.3.7 build-381511), but I'm still getting the same error. The error I'm getting is like this:

        ...
        /tmp/vmware-root/modules/vmhgfs-only/super.c:73:4: error: unknown field 'clear_inode' specified in initializer
        make[2]: *** [/tmp/vmware-root/modules/vmhgfs-only/super.o] Error 1
        make[1]: *** [_module_/tmp/vmware-root/modules/vmhgfs-only] Error 2
        make[1]: Leaving directory `/usr/src/linux-headers-2.6.38-8-generic'
        make: *** [vmhgfs.ko] Error 2
        make: Leaving directory `/tmp/vmware-root/modules/vmhgfs-only'

    and also this:

        /tmp/vmware-root/modules/vmci-only/vmci_drv.c:91:4: error: unknown field 'ioctl' specified in initializer
        /tmp/vmware-root/modules/vmci-only/vmci_drv.c:91:4: warning: initialization from incompatible pointer type
        /tmp/vmware-root/modules/vmci-only/vmci_drv.c: In function 'vmci_init':
        /tmp/vmware-root/modules/vmci-only/vmci_drv.c:151:4: error: implicit declaration of function 'init_MUTEX'
        make[2]: *** [/tmp/vmware-root/modules/vmci-only/vmci_drv.o] Error 1
        make[1]: *** [_module_/tmp/vmware-root/modules/vmci-only] Error 2
        make[1]: Leaving directory `/usr/src/linux-headers-2.6.38-8-generic'
        make: *** [vmci.ko] Error 2
        make: Leaving directory `/tmp/vmware-root/modules/vmci-only'

    Any ideas?

    Read the article

  • Building and Deploying Windows Azure Web Sites using Git and GitHub for Windows

    - by shiju
    The Microsoft Windows Azure team has released a new version of Windows Azure that provides many excellent features. The new Windows Azure provides Web Sites, which allows you to deploy up to 10 web sites for free in a multitenant shared environment, and you can easily upgrade a web site to a private, dedicated virtual server when traffic grows. The Meet Windows Azure Fact Sheet provides the following information about Windows Azure Web Sites:

    Windows Azure Web Sites enable developers to easily build and deploy websites with support for multiple frameworks and popular open source applications, including ASP.NET, PHP and Node.js. With just a few clicks, developers can take advantage of Windows Azure's global scale without having to worry about operations, servers or infrastructure. It is easy to deploy existing sites, if they run on Internet Information Services (IIS) 7, or to build new sites, with a free offer of 10 websites upon signup, with the ability to scale up as needed with reserved instances.

    Windows Azure Web Sites includes support for the following:

    Multiple frameworks including ASP.NET, PHP and Node.js
    Popular open source software apps including WordPress, Joomla!, Drupal, Umbraco and DotNetNuke
    Windows Azure SQL Database and MySQL databases
    Multiple types of developer tools and protocols including Visual Studio, Git, FTP, Visual Studio Team Foundation Services and Microsoft WebMatrix

    Sign up for Windows Azure and Enable Web Sites

    You can sign up for a 90-day free trial account in Windows Azure from here. After creating an account in Windows Azure, go to https://account.windowsazure.com/ and select "preview features" to view the available previews. In the Web Sites section of the preview features, click "try it now", which enables the Web Sites feature.

    Create a Web Site in Windows Azure

    To create a web site, log in to the Windows Azure portal, select Web Sites, and click the New icon in the left corner. Click WEB SITE, QUICK CREATE, and enter values for URL and the REGION dropdown. You can see all web sites from the dashboard of the Windows Azure portal.

    Set up Git Publishing

    Select your web site from the dashboard and select "Set up Git publishing". To enable Git publishing, you must give a user name and password, which will initialize a Git repository.

    Clone the Git Repository

    We can use GitHub for Windows to publish apps to non-GitHub repositories, which is well explained by Phil Haack on his blog post. Here we are going to deploy the web site using GitHub for Windows. Let's clone a Git repository using the Git URL obtained from the Windows Azure portal: copy the Git URL and execute "git clone" with it. You can use the Git Shell provided by GitHub for Windows; to get it, right-click on GitHub for Windows and select "open shell here", as shown in the picture below. When executing the git clone command, it will ask for the password you specified in the Windows Azure portal. After cloning the Git repository, you can drag and drop the local Git repository folder onto the GitHub for Windows GUI. This will automatically add the Windows Azure Web Site repository to GitHub for Windows, where you can commit your changes and publish your web site to Windows Azure.

    Publish the Web Site using GitHub for Windows

    We can add files for multiple frameworks, including ASP.NET, PHP and Node.js, to the local repository folder and easily publish to Windows Azure from the GitHub for Windows GUI.
    For this demo, let me just add a simple Node.js file named Server.js which handles a few requests:

        var http = require('http');
        var port = process.env.PORT;
        var querystring = require('querystring');
        var utils = require('util');
        var url = require("url");

        var server = http.createServer(function(req, res) {
            switch (req.url) { // checking the request url
                case '/':
                    homePageHandler(req, res);     // handler for home page
                    break;
                case '/register':
                    registerFormHandler(req, res); // handler for register
                    break;
                default:
                    nofoundHandler(req, res);      // handler for 404 not found
                    break;
            }
        });
        server.listen(port);

        // function to display the html form
        function homePageHandler(req, res) {
            console.log('Request handler home was called.');
            res.writeHead(200, {'Content-Type': 'text/html'});
            var body = '<html>' +
                '<head>' +
                '<meta http-equiv="Content-Type" content="text/html; ' +
                'charset=UTF-8" />' +
                '</head>' +
                '<body>' +
                '<form action="/register" method="post">' +
                'Name:<input type=text value="" name="name" size=15></br>' +
                'Email:<input type=text value="" name="email" size=15></br>' +
                '<input type="submit" value="Submit" />' +
                '</form>' +
                '</body>' +
                '</html>';
            // response content
            res.end(body);
        }

        // handler for POST request
        function registerFormHandler(req, res) {
            console.log('Request handler register was called.');
            var pathname = url.parse(req.url).pathname;
            console.log("Request for " + pathname + " received.");
            var postData = "";
            req.on('data', function(chunk) {
                // append the current chunk of data to the postData variable
                postData += chunk.toString();
            });
            req.on('end', function() {
                // doing something with the posted data
                res.writeHead(200, "OK", {'Content-Type': 'text/html'});
                // parse the posted data
                var decodedBody = querystring.parse(postData);
                // output the decoded data to the HTTP response
                res.write('<html><head><title>Post data</title></head><body><pre>');
                res.write(utils.inspect(decodedBody));
                res.write('</pre></body></html>');
                res.end();
            });
        }

        // error handler for 404 not found
        function nofoundHandler(req, res) {
            console.log('Request handler nofound was called.');
            res.writeHead(404, {'Content-Type': 'text/plain'});
            res.end('404 Error - Request handler not found');
        }

    If there is any change in the local repository folder, GitHub for Windows will automatically detect the changes. In the step above we just added a Server.js file, so GitHub for Windows will detect the change. Let's commit the changes to the local repository before publishing the web site to Windows Azure. After committing all the changes, you can click the publish button, which will publish all the changes to the Windows Azure repository.
    The following screenshot shows the deployment history from the Windows Azure portal. GitHub for Windows provides a sync button which can be used to synchronize the local repository with the Windows Azure repository after any further commits. Our web site is running after the deployment using Git.

    Summary

    Windows Azure Web Sites lets developers easily build and deploy websites with support for multiple frameworks including ASP.NET, PHP and Node.js, and sites can easily be deployed using Visual Studio, Git, FTP, Visual Studio Team Foundation Services and Microsoft WebMatrix. In this demo, we deployed a Node.js web site to Windows Azure using Git. We can use GitHub for Windows to publish apps to non-GitHub repositories, including publishing Web Sites to Windows Azure.

    Read the article

  • Free Tools for Network Super-Heroes!

    - by TATWORTH
    At http://www.solarwinds.com/products/solarwinds_free_tools/ there is a comprehensive list of free tools, including the IP Address Tracker that I previously blogged about. Suggest this list to your network administrators! The tools include:

    Permissions Analyzer for Active Directory (http://www.solarwinds.com/products/freetools/permissions_analyzer_for_active_directory/)
    WMI Monitor
    VM Console
    Real-Time NetFlow Analyzer
    Network Device Monitor
    Network Config Generator
    TFTP Server
    IP Address Tracker
    VM Monitor
    Advanced Subnet Calculator
    Wake-On-LAN

    Read the article

  • Google I/O 2011: JavaScript Programming in the Large with Closure Tools

    Google I/O 2011: JavaScript Programming in the Large with Closure Tools. Speaker: Michael Bolin. Most developers who have tinkered with JavaScript could not imagine writing 1,000 lines of code in such a language, let alone 100,000. Yet that is exactly what Google engineers have done, using a suite of JavaScript tools named "Closure", to produce many of the most popular and sophisticated applications on the Web, such as Gmail and Google Maps. From: GoogleDevelopers. Time: 57:07

    Read the article

  • Need help with DNS. Registrar is NS, Web Site at WinHost, Email at eHost

    - by Leon
    Need help moving a web site for a client, which I will call ClientABC. The web site is ClientABC.com, which is hosted at Rackspace, with email hosted at eHost. We are transferring the site from Rackspace to WinHost and keeping the email hosted at eHost. I would like the transfer to happen with little to no downtime for the web site and email (email is most important).

    Current config:

    Client owns the domain; the registrar is Network Solutions
    The domain name is managed by VendorX at Rackspace
    The web site is hosted on Rackspace servers
    Email is hosted at eHost

    Post-move config:

    Web site hosted at WinHost
    Email stays at eHost

    Here is my plan for the transfer:

    Copy the site files to WinHost and test to assure the site is fully functional
    Set up the MX record in the WinHost account to point to eHost's servers
    Change the DNS in Network Solutions from Rackspace to WinHost

    Questions: Will this work? What am I missing? Should I expect downtime or any issues with email? I understand that there will be a period when traffic to the site is handled at both Rackspace and WinHost, and that email traffic will be routed through both hosts as well. Will this cause issues? How will I know when the change is fully propagated, with Rackspace out of the equation and WinHost handling everything (so I can kill the Rackspace account)?

    Thanks in advance!
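    On the last question: a simple way to watch propagation is to query the domain's A and MX records yourself against several public resolvers until they all consistently return WinHost's address and eHost's mail servers. A minimal sketch, assuming the dnspython package (the expected IP and the resolver list are placeholders):

        # Sketch: poll public resolvers until the A record answers with
        # the new WinHost IP everywhere. Assumes dnspython >= 2.0 (use
        # resolver.query() on 1.x); expected IP and resolvers are placeholders.
        import dns.resolver

        EXPECTED_IP = "203.0.113.10"               # placeholder WinHost address
        RESOLVERS = ["8.8.8.8", "208.67.222.222"]  # Google DNS, OpenDNS

        def fully_propagated(domain="clientabc.com"):
            for server in RESOLVERS:
                resolver = dns.resolver.Resolver()
                resolver.nameservers = [server]
                answers = resolver.resolve(domain, "A")
                if EXPECTED_IP not in {rr.address for rr in answers}:
                    return False
            return True

    Once this returns True everywhere you care about (and an equivalent check with record type "MX" still points at eHost), Rackspace is out of the picture and the old account can be closed.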

    Read the article

  • Specialized course in Web programming or Generalized course in Computer Science?

    - by Sugan
    I am planning to do my masters in Computer Science in the UK, and I am interested in web programming. What should I opt for: a general course in Computer Science, or some specialized course in Web Programming? If web programming, are there many colleges offering such courses, and are there many job offers in the UK? I am not sure whether I can ask this question here; if not, point me to where I can.

    Read the article

  • SQL University: Database testing and refactoring tools and examples

    - by Mladen Prajdic
    This is a post for a great idea called SQL University, started by Jorge Segarra, also famously known as SqlChicken on Twitter. It's a collection of blog posts on different database-related topics contributed by several smart people all over the world. This week is mine, and we'll be talking about database testing and refactoring. In three posts we'll cover:

    SQLU part 1 - What and why of database testing
    SQLU part 2 - What and why of database refactoring
    SQLU part 3 - Database testing and refactoring tools and examples

    This is the third and last part of the series, and in it we'll take a look at tools we can test and refactor with, plus an example of both.

    Tools of the trade

    First, a few thoughts about how to go about testing a database. I'm firmly against any testing tool that goes into the database itself or needs an extra database. Unit tests for the database and for applications using the database should all be in one place, using the same technology. By using database-specific frameworks we fragment our tests into many places and increase test-system complexity. Let's take a look at some testing tools.

    1. NUnit, xUnit, MbUnit
    All three are .NET testing frameworks meant to unit test .NET applications, but we can test databases with them just fine. I use NUnit because I've always used it for work and personal projects. One day this might change, so the thing to remember is to be flexible if something better comes along. All three are quite similar and you should be able to switch between them without much problem.

    2. TSQLUnit
    As much as this framework is helpful for the non-C#-savvy folks, I don't like it, for the reason I stated above: it lives in the database and thus fragments the testing infrastructure. Also, it appears that it's not being actively developed anymore.

    3. DbFit
    I haven't had the pleasure of trying this tool just yet, but it's on my to-do list. From what I've read and heard, Gojko Adzic (@gojkoadzic on Twitter) has done a remarkable job with it.

    4. Red Gate SQL Refactor and Apex SQL Refactor
    Neither of these refactoring tools is free; however, if you have hardcore refactoring planned they are worth looking into. I've only used Red Gate's Refactor and was quite impressed with it.

    5. Reverting the database state
    I've talked before about ways to revert a database to its pre-test state after unit testing. This still holds and I haven't changed my mind. Also make sure to read the comments, as they are quite informative. I especially like the idea of setting up and tearing down the schema for each test group with NHibernate.

    Testing and refactoring example

    We'll take a look at a simple schema and data test for a view, and at refactoring the SELECT * in that view. We'll use a single table PhoneNumbers with ID and Phone columns. Then we'll refactor the Phone column into three columns, Prefix, Number and Suffix, and lastly remove the original Phone column. Then we'll check how the view behaves with tests in NUnit. The comments in the code explain the problem, so be sure to read them. I'm assuming you know NUnit and C#.
    T-SQL code:

        USE tempdb
        GO
        CREATE TABLE PhoneNumbers
        (
            ID INT IDENTITY(1,1),
            Phone VARCHAR(20)
        )
        GO
        INSERT INTO PhoneNumbers(Phone)
        SELECT '111 222333 444' UNION ALL
        SELECT '555 666777 888'
        GO

        -- notice we don't have WITH SCHEMABINDING
        CREATE VIEW vPhoneNumbers
        AS
            SELECT * FROM PhoneNumbers
        GO

        -- Let's take a look at what the view returns.
        -- If we add new columns and rows, both tests will fail.
        SELECT * FROM vPhoneNumbers
        GO
        -- DoesViewReturnCorrectColumns test will SUCCEED
        -- DoesViewReturnCorrectData test will SUCCEED

        -- refactor to split the Phone column into 3 parts
        ALTER TABLE PhoneNumbers ADD Prefix VARCHAR(3)
        ALTER TABLE PhoneNumbers ADD Number VARCHAR(6)
        ALTER TABLE PhoneNumbers ADD Suffix VARCHAR(3)
        GO
        -- update the new columns
        UPDATE PhoneNumbers
        SET Prefix = LEFT(Phone, 3),
            Number = SUBSTRING(Phone, 5, 6),
            Suffix = RIGHT(Phone, 3)
        GO
        -- remove the old column
        ALTER TABLE PhoneNumbers DROP COLUMN Phone
        GO

        -- This returns unexpected results!
        -- It returns 2 columns, ID and Phone, even though
        -- we don't have a Phone column anymore.
        -- Notice that the data is from the Prefix column.
        -- This is a danger of SELECT *.
        SELECT * FROM vPhoneNumbers
        -- DoesViewReturnCorrectColumns test will SUCCEED
        -- DoesViewReturnCorrectData test will FAIL

        -- for a fix we have to call sp_refreshview
        -- to refresh the view definition
        EXEC sp_refreshview 'vPhoneNumbers'
        -- after the refresh the view returns 4 columns;
        -- this breaks the input/output behavior of the database,
        -- which refactoring MUST NOT do
        SELECT * FROM vPhoneNumbers
        -- DoesViewReturnCorrectColumns test will FAIL
        -- DoesViewReturnCorrectData test will FAIL

        -- to fix the input/output behavior change problem
        -- we have to concat the 3 columns into one named Phone
        ALTER VIEW vPhoneNumbers
        AS
        SELECT ID,
               Prefix + ' ' + Number + ' ' + Suffix AS Phone
        FROM PhoneNumbers
        GO
        -- now it works as expected
        SELECT * FROM vPhoneNumbers
        -- DoesViewReturnCorrectColumns test will SUCCEED
        -- DoesViewReturnCorrectData test will SUCCEED

        -- clean up
        DROP VIEW vPhoneNumbers
        DROP TABLE PhoneNumbers

    C# test code:

        [Test]
        public void DoesViewReturnCorrectColumns()
        {
            // conn is a valid SqlConnection to the server's tempdb.
            // Note the SET FMTONLY ON, with which we return only schema and no data.
            using (SqlCommand cmd = new SqlCommand("SET FMTONLY ON; SELECT * FROM vPhoneNumbers", conn))
            {
                DataTable dt = new DataTable();
                dt.Load(cmd.ExecuteReader(CommandBehavior.CloseConnection));
                // test returned schema: number of columns, column names and data types
                Assert.AreEqual(dt.Columns.Count, 2);
                Assert.AreEqual(dt.Columns[0].Caption, "ID");
                Assert.AreEqual(dt.Columns[0].DataType, typeof(int));
                Assert.AreEqual(dt.Columns[1].Caption, "Phone");
                Assert.AreEqual(dt.Columns[1].DataType, typeof(string));
            }
        }

        [Test]
        public void DoesViewReturnCorrectData()
        {
            // conn is a valid SqlConnection to the server's tempdb
            using (SqlCommand cmd = new SqlCommand("SELECT * FROM vPhoneNumbers", conn))
            {
                DataTable dt = new DataTable();
                dt.Load(cmd.ExecuteReader(CommandBehavior.CloseConnection));
                // test returned data: number of rows and their values
                Assert.AreEqual(dt.Rows.Count, 2);
                Assert.AreEqual(dt.Rows[0]["ID"], 1);
                Assert.AreEqual(dt.Rows[0]["Phone"], "111 222333 444");
                Assert.AreEqual(dt.Rows[1]["ID"], 2);
                Assert.AreEqual(dt.Rows[1]["Phone"], "555 666777 888");
            }
        }

    With this simple example we've seen how a very simple schema can cause a lot of problems in the whole application/database system if it doesn't have tests. Imagine what would happen if some outside process depended on that view.
    It would get wrong data and propagate it silently throughout the system, and that is not good. So have tests, at least for the crucial parts of your system. And with that we conclude the Database Testing and Refactoring week at SQL University. I hope you learned something new; enjoy the learning weeks to come. Have fun!

    Read the article

  • Windows Phone 7 Developer Tools - January 2011 Update

    - by Nikita Polyakov
    Long time no talk? So to make up for it, here is something very new: an update to the WP7 Dev Tools! The Windows Phone 7 Developer Tools January 2011 Update provides bits that you install on TOP of the current WP7 Dev Tools on your machine. If you are installing the tools for the first time, this update replaces the previously released October patch; in fact, that patch is no longer available, as the January 2011 update replaces it entirely.

    What is in this update?

    TextBox support for Copy & Paste
    An updated emulator image that contains Copy & Paste for your testing
    Performance tweaks for the OS
    Minor bugs and fixes

    How does it work?

    Copy & Paste extends the existing TextBox control with the new functionality. There is currently no API access to the clipboard, and no support for controls that are not based on TextBox.

    If I have / Do I need to:

    A current application in the marketplace / No action is required
    An application that contains a TextBox on a Pivot or Panorama control surface / Test your application in the provided emulator. The recommendation is to move TextBox controls from directly on top of controls that listen to gesture movement onto their own pop-off screens or entire pages, as this might interfere with the select behavior for Copy & Paste
    Controls that do not inherit from TextBox / Such controls will not get the new Copy & Paste behavior

    Note: the update materials, FAQ and Q&A do not answer WHEN the update for the OS will be sent to the phones. Also note: this update does NOT update your developer phone to enable Copy & Paste or any other features.

    Windows Phone 7 Training Kit February Update

    The Windows Phone Training Kit has also been updated; you can grab a fresh copy here.

    Where do I find more good information, documentation and training? This very awesome blog post from the Windows Phone Developer Blog: Windows Phone 7 Documentation Landscape. The official blog post on the update is here.

    Happy coding!
    -Nikita

    PS: I am well aware that it is Feb 4th and not January :) If you were disappointed that Microsoft said nothing at all about the future of WP7 at CES, don't forget that MWC 2011 is Feb 14th; I will be listening for Windows Phone announcements then, as that is where the Windows Phone 7 announcements were made.

    Read the article

  • Why did Alan Kay say, "The Internet was so well done, but the web was by amateurs"?

    - by kalaracey
    OK, so I paraphrased. The full quote: "The Internet was done so well that most people think of it as a natural resource like the Pacific Ocean, rather than something that was man-made. When was the last time a technology with a scale like that was so error-free? The Web, in comparison, is a joke. The Web was done by amateurs." -- Alan Kay. I am trying to understand the history of the Internet and the web, and this statement is hard to understand. I have read elsewhere that the Internet is now used for very different things than it was designed for, and so perhaps that factors in. What makes the Internet so well done, and what makes the web so amateurish? (Of course, Alan Kay is fallible, and no one here is Alan Kay, so we can't know precisely why he said that, but what are some possible explanations?) See also the original interview.

    Read the article

  • Keyword Selector Tools - Choosing the Best Tool For Your Keyword Research

    Keyword selector tools are crucial for locating keywords and keyword phrases which, if used wisely, will increase traffic to a site. Increased traffic generally means increased profits, so it is important to pick the keyword research tool that best suits your needs. However, the choice of tools is vast: some are free, some cost a one-off fee, whilst others require a monthly subscription. This article gives tips and ideas to consider when choosing a keyword selector tool.

    Read the article

  • Search Engine Optimization Software Tools

    In the Internet marketing field there are several essential search engine optimization software tools available for purchase on the Internet today. Professional SEO software tools identify niche markets for digital products for sale and allow the user to analyze the marketplace and assess the competition.

    Read the article
