Search Results

Search found 32454 results on 1299 pages for 'google webmaster tools'.

Page 202/1299

  • Google Chrome shows an error message box every time it starts.

    - by Benjamin
    I removed the Google Chrome 10.x dev version and installed the 8.x stable version again. Since then, Chrome shows this message box every time it starts: "Your profile can not be used because it is from a newer version of Google Chrome. Some features may be unavailable. Please specify a different profile directory or use a newer version of Chrome." Which profile is it referring to, and how do I fix it? Thanks.
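
    (Editorial note, not part of the original question: the message appears because the on-disk profile was written by the newer 10.x build, so the reinstalled 8.x build refuses to read it. A minimal sketch of two common workarounds, with paths and binary names as assumptions for a default install:)

        # Start Chrome 8.x against a fresh, empty profile directory so it no longer
        # reads the 10.x profile (the binary name/path depends on your OS and install):
        chrome --user-data-dir="/path/to/fresh-profile"

        # Alternatively, with Chrome closed, rename the existing profile folder so it is
        # recreated on the next launch. On Windows it normally lives under
        # "%LOCALAPPDATA%\Google\Chrome\User Data" (location assumed for a default
        # install); back it up rather than deleting it if you want your bookmarks.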

    Read the article

  • What's the syntax to add a calendar reminder from Google Command Line?

    - by Traveling Tech Guy
    I've been using googlecl successfully to add events to my calendar. Things like: google calendar add "call Paul tomorrow at 8:30am" work great, and add the appropriate event at the right time. But no reminder is added for the event. I tried: google calendar add "call Paul tomorrow at 8:30am reminder 10 minutes" and other combinations. It just ends up adding the "reminder" instruction to the event description. What's the syntax I should use to add a, let's say, 10-minute pop-up reminder? Thanks
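
    (Editorial aside: later googlecl builds grew a --reminder option for calendar add. Treat the flag below as an assumption, not a confirmed answer, and check "google calendar add --help" for your installed version:)

        # Assumed syntax for newer googlecl releases -- verify with --help first:
        google calendar add --reminder 10m "call Paul tomorrow at 8:30am"
        # If your build predates the option, the reminder text just ends up in the
        # event description, as described above.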

    Read the article

  • How to connect to a Google App Engine server on an iMac on the internal network?

    - by Will Merydith
    I have 3 iMacs and a Windows machine on my home network, all connected via an Airport Extreme router. I'm developing Google App Engine applications locally on one of the iMacs, and can view applications using http://localhost:8080 (or whatever port I choose). How do I connect to those applications from other iMacs and Windows machines in my network? I've located the IP for the iMac hosting Google App Engine: 10.0.1.7. But when I try http://10.0.1.7:8080 from another machine it will not load the page.
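
    (Editorial aside: a likely cause is that the App Engine development server binds to localhost by default, so it only answers requests from the machine it runs on. A minimal sketch, assuming the Python SDK; the flag name differs between SDK versions:)

        # Older Python SDKs:
        dev_appserver.py --address=0.0.0.0 --port=8080 /path/to/your/app
        # Newer SDKs renamed the flag:
        dev_appserver.py --host=0.0.0.0 --port=8080 /path/to/your/app
        # Once bound to 0.0.0.0, other machines on the LAN can reach it at
        # http://10.0.1.7:8080/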

    Read the article

  • How can I migrate mails from Yahoo! Mail to Google Apps?

    - by alnorth29
    We'd like to move from using Yahoo! Mail to our Google Apps account. It'd be great to be able to migrate all the mails across from the old accounts to the new ones. If we were using Gmail accounts this would be easy, as the migration is offered by Google; for some reason they don't offer it to those using custom domains. As far as I know, Yahoo! doesn't offer POP or IMAP access to non-paying customers, and I'd rather not pay $20 per account that I want to migrate. Does anyone know of any easy and cheap solutions?

    Read the article

  • On OS X, how do I print (a web-page) directly into my Google Docs account?

    - by Peter Mounce
    Is it possible to, from the OS X 10.6 print dialog, print the web-page directly into my Google Documents account? I can save as PDF and upload it, but, well, I want to cut out the middleman (ie, me; I make too many mistakes...). I tend to use Chrome to browse the web in case that's relevant. Are there Google Docs upload-helpers in general? I tried gdocuploaded (an Adobe AIR App) but it refused to so much as launch.

    Read the article

  • Can Google Apps users view Exchange users' public calendars and contacts?

    - by CT
    My company currently uses MS Exchange 2003 for company email, contacts, and calendars. We have approximately 150 users. Construction industry. I would like to look into migrating from Exchange to Google Apps. It will be an easier sell to the powers that be if we can successfully migrate a few smaller departments first, rather than moving the entire company at once. I would like to first migrate our field superintendents, who are usually out of the office working remotely. Approx 30 users. Will Google Apps users be able to see our Exchange users' calendars and vice versa? How about public folders? Anyone's migration story is much appreciated. Thank you.

    Read the article

  • Google Rules for Retail

    - by David Dorf
    In the book What Would Google Do?, Jeff Jarvis outlines ten "Google Rules" that define how Google acts.  These rules help define how Web 2.0 businesses operate today and into the future.  While there's a chapter in the book on applying these rules to the retail industry, it wasn't very in-depth.  So I've decided to more directly apply the rules to retail, along with some notable examples of success.  The list below shows Jeff's Google Rule, some industry examples, and the New Retailer Rule that I created for each.

    New Relationship ("Your worst customer is your friend; your best customer is your partner"). Industry examples: Newegg.com lets manufacturers respond to customer comments that are critical of the product, and their EggXpert site lets customers help other customers. New Retailer Rule: Listen to what your customers are saying about you.  Convert the critics to fans and the fans to influencers.

    New Architecture ("Join a network; be a platform"). Industry examples: Tesco and Best Buy released APIs for their product catalogs so third parties could create new applications. New Retailer Rule: Become a destination for information.

    New Publicness ("Life is public, so is business"). Industry examples: The Zappos and Whole Foods founders are prolific tweeters/bloggers, sharing their opinions and connecting to customers.  It's not always pretty, but it's genuine. New Retailer Rule: Be transparent.  Share both your successes and failures with your customers.

    New Society ("Elegant organization"). Industry examples: Wet Seal helps their customers assemble outfits and show them off to each other.  Barnes & Noble has a community site that includes a book club. New Retailer Rule: Communities of your customers already exist, so help them organize better.

    New Economy ("Mass market is dead; long live the mass of niches"). Industry examples: lululemon found a niche for yoga-inspired athletic wear.  Threadless uses crowd-sourcing to design short runs of T-shirts. New Retailer Rule: Serve small markets with niche products.

    New Business Reality ("Decide what business you're in"). Industry example: When Lowe's realized catering to women brought the men along, their sales increased. New Retailer Rule: Customers want experiences to go with the products they buy.

    New Attitude ("Trust the people and listen"). Industry example: In 2008 Starbucks launched MyStarbucksIdea to solicit ideas from their customers. New Retailer Rule: Use social networks as additional data points for making better merchandising decisions.

    New Ethic ("Be honest and transparent; don't be evil"). Industry examples: Target is giving away reusable shopping bags for Earth Day.  Kohl's has outfitted 67 stores with solar arrays. New Retailer Rule: Being green earns customers' respect and lowers costs too.

    New Speed ("Life is live"). Industry examples: H&M and Zara keep up with fashion trends. New Retailer Rule: Be prepared to pounce on your customers' fickle interests.

    New Imperatives ("Encourage, enable and protect innovation"). Industry examples: 1-800-Flowers was the first to do sales on Facebook and an early adopter of mobile commerce.  The Sears Personal Shopper mobile app finds products based on a photo. New Retailer Rule: Give your staff permission to fail so innovation won't be stifled.

    Jeff will be a keynote speaker at Crosstalk, our upcoming annual user conference, so I'm looking forward to hearing more of his perspective on retail and the new economy.

    Read the article

  • Continuous Integration for SQL Server Part II – Integration Testing

    - by Ben Rees
    My previous post, on setting up Continuous Integration for SQL Server databases using GitHub, Bamboo and Red Gate's tools, covered the first two parts of a simple Database Continuous Delivery process: putting your database into a source control system, and running a continuous integration process each time changes are checked in. However there is, of course, a lot more to Continuous Delivery than that. Specifically, in addition to the above: putting some actual integration tests into the CI process (otherwise, they don't really do much, do they!?), deploying the database changes with a managed, automated approach, and monitoring what you've just put live, to make sure you haven't broken anything. This post will detail how to set up a very simple pipeline for implementing the first of these (continuous integration testing). NB: A lot of the setup in this post is built on top of the configuration from before, so it might be difficult to implement this post without running through part I first. There'll then be a third post on automated database deployment, followed by a final post dealing with the last item: monitoring changes on the live system. In the previous post, I used a mixture of Red Gate products and other 3rd party software, GitHub and Atlassian Bamboo specifically. This was partly because I believe most people work in a heterogeneous environment, using software from different vendors to suit their purposes, and I wanted to show how this could work for this process. For example, you could easily substitute Atlassian's BitBucket or Stash for GitHub, depending on your needs, or use an alternative CI server such as TeamCity, TFS or Jenkins. However, in this post, I'll be mostly using Red Gate products only (other than tSQLt). I do this firstly because I work for Red Gate. However, I also think that in the area of Database Delivery processes, nobody else has the offerings to implement this process fully, so I didn't have any choice!

    Background on Continuous Delivery

    For me, a great source of information on what makes a proper Continuous Delivery process is the Jez Humble and David Farley classic: Continuous Delivery – Reliable Software Releases through Build, Test, and Deployment Automation. This book is not, of course, primarily about databases, and the process I outline here and in the previous article is a gross simplification of what Jez and David describe (not least because it's that much harder for databases!). However, a lot of the principles that they describe can be equally applied to database development and, I would argue, should be. As I say however, what I describe here is a very simple version of what would be required for a full production process. A couple of useful resources on handling some of these complexities can be found in the following two references: Refactoring Databases – Evolutionary Database Design, by Scott J Ambler and Pramod J. Sadalage, and Versioning Databases – Branching and Merging, by Scott Allen. In particular, I don't deal at all with the issues of multiple branches and merging of those branches, an issue made particularly acute by the use of GitHub. The other point worth making is that, in the words of Martin Fowler: Continuous Delivery is about keeping your application in a state where it is always able to deploy into production. I.e. we are not talking about continuously delivering updates to the production database every time someone checks in an amendment to a stored procedure.
That is possible (and what Martin calls Continuous Deployment). However, again, that’s more than I describe in this article. And I doubt I need to remind DBAs or Developers to Proceed with Caution!   Integration Testing Back to something practical. The next stage, building on our set up from the previous article, is to add in some integration tests to the process. As I say, the CI process, though interesting, isn’t enormously useful without some sort of test process running. For this we’ll use the tSQLt framework, an open source framework designed specifically for running SQL Server tests. tSQLt is part of Red Gate’s SQL Test found on http://www.red-gate.com/products/sql-development/sql-test/ or can be downloaded separately from www.tsqlt.org - though I’ll provide a step-by-step guide below for setting this up. Getting tSQLt set up via SQL Test Click on the link http://www.red-gate.com/products/sql-development/sql-test/ and click on the blue Download button to download the Red Gate SQL Test product, if not already installed. Follow the install process for SQL Test to install the SQL Server Management Studio (SSMS) plugin on to your machine, if not already installed. Open SSMS. You should now see SQL Test under the Tools menu:   Clicking this link will give you the basic SQL Test dialogue: As yet, though we’ve installed the SQL Test product we haven’t yet installed the tSQLt test framework on to any particular database. To do this, we need to add our RedGateApp database using this dialogue, by clicking on the + Add Database to SQL Test… link, selecting the RedGateApp database and clicking the Add Database link:   In the next screen, SQL Test describes what will be installed on the database for the tSQLt framework. Also in this dialogue, uncheck the “Add SQL Cop tests” option (shown below). SQL Cop is a great set of pre-defined tests that work within the tSQLt framework to check the general health of your SQL Server database. However, we won’t be using them in this particular simple example: Once you’ve clicked on the OK button, the changes described in the dialogue will be made to your database. Some of these are shown in the left-hand-side below: We’ve now installed the framework. However, we haven’t actually created any tests, so this will be the next step. But, before we proceed, we’ve made an update to our database so should, again check this in to source control, adding comments as required:   Also worth a quick check that your build still runs with the new additions!: (And a quick check of the RedGateAppCI database shows that the changes have been made).   Creating and Testing a Unit Test There are, of course, a lot of very interesting unit tests that you could and should set up for a database. The great thing about the tSQLt framework is that you can write these in SQL. The example I’m going to use here is pretty Mickey Mouse – our database table is going to include some email addresses as reference data and I want to check whether these are all in a correct email format. Nothing clever but it illustrates the process and hopefully shows the method by which more interesting tests could be set up. Adding Reference Data to our Database To start, I want to add some reference data to my database, and have this source controlled (as well as the schema). 
    First of all I need to add some data to my solitary table. This can be done a number of ways, but I'll do it in SSMS for simplicity: I then add some reference data to my table: Currently this reference data just exists in the database. For proper integration testing, this needs to form part of the source-controlled version of the database, and so needs to be added to the Git repository. This can be done via SQL Source Control, though first a Primary Key needs to be added to the table. Right click the table, select Design, then right-click on the first "id" row. Then click on "Set Primary Key": NB: once this change is made, click Save to save the change to the table. Then, to source control this reference data, right click on the table (dbo.Email) and select the following option: In the next screen, link the data in the Email table by selecting it from the list and clicking "save and close": We should at this point re-commit the changes (both the addition of the Primary Key, and the data) to the Git repo. NB: From here on, I won't show screenshots for the GitHub side of things; it's the same each time: whenever a change is made in SQL Source Control and committed to your local folder, you then need to sync this in the GitHub Windows client (as this is where the build server, Bamboo, is taking it from). An interesting point to note here, when these changes are committed in SQL Source Control (right-click database and select "Commit Changes to Source Control.."): the display gives a warning about possibly needing a migration script for the "Add Primary Key" step of the changes. This isn't actually necessary in this case, but this mechanism would allow you to create override scripts to replace the default change scripts created by the SQL Compare engine (which runs underneath SQL Source Control). Ignoring this message (!), we add a comment and commit the changes to Git. I then sync these, run a build (or the build gets run automatically), and check that the data is being deployed over to the target RedGateAppCI database.

    Creating and Running the Test

    As I mention, the test I'm going to use here is a very simple one: are the email addresses in my reference table valid? This isn't, of course, a full test of email validation (I expect the email addresses I've chosen here aren't really those of the Fab Four), but just a very basic check of the format used. I've taken the relevant SQL from this Stack Overflow article. In SSMS select "SQL Test" from the Tools menu, then click on + New Test: In the next screen, give your new test a name, and also enter a name in the Test Class box (test classes are schemas that help you keep things organised). Also check that the database in which the test is going to be created is correct, RedGateApp in this example: Click "Create Test". After closing a couple of subsequent dialogues, you'll see a dummy script for the test that needs filling in: We now need to define the SQL for our test. As mentioned before, tSQLt allows you to write your unit tests in T-SQL, and the code I'm going to use here is as below.
    This needs to be copied and pasted into the query window, to replace the default given by tSQLt:

        -- Basic email check test
        ALTER PROCEDURE [MyChecks].[test Check Email Addresses]
        AS
        BEGIN
            SET NOCOUNT ON

            Declare @Output VarChar(max)
            Set @Output = ''

            SELECT @Output = @Output + Email + Char(13) + Char(10)
            FROM dbo.Email
            WHERE email NOT LIKE '%_@__%.__%'

            If @Output > ''
            Begin
                Set @Output = Char(13) + Char(10) + @Output
                EXEC tSQLt.Fail @Output
            End
        END;

    Once this script is entered, hit execute to add the Stored Procedure to the database. Before committing the test to source control, it's worth just checking that it works! For a positive test, click on "SQL Test" from the Tools menu, then click Run Tests. You should see output like the following: a green tick to indicate success! But of course, what we also need to do is test that this is actually doing something by showing a failed test. Edit one of the email addresses in your table to an incorrect format: Now, re-run the same SQL Test as before and you'll see the following: Great – we now know that our test is really doing something! You'll also see a useful error message at the bottom of SSMS: (leave the email address as invalid for now, for the next steps). The next stage is to check this new test in to source control again, by right-clicking on the database and checking in the changes with a commit message (and not forgetting to sync in the GitHub client).

    Checking that the Tests are Running as Integration Tests

    After the changes above are made, and after a build has run on Bamboo (manual or automatic), looking at the Stored Procedures for the RedGateAppCI database, the SPROC for the new test has been moved over to the database. However this is not exactly what we were after. We didn't want to just copy objects from one database to another, but actually run the tests as part of the build/integration test process. I.e. we're continuously checking any changes we make (in this case, to the reference data emails), to ensure we're not breaking a test that we've set up. The behaviour we want to see is that, if we check in static data that is incorrect (as we did in step 9 above) and we have the tSQLt test set up, then our build in Bamboo should fail. However, re-running the build shows the following: sadly, a successful build! To make sure the tSQLt tests are run as part of the integration test, we need to amend a switch in the Red Gate CI config file. First, navigate to the file sqlCI.targets in your working folder: Edit this document, make the following change, save the document, then commit and sync this change in the GitHub client:

        <!-- tSQLt tests -->
        <!-- Optional -->
        <!-- To run tSQLt tests in source control for the database, enter true. -->
        <enableTsqlt>true</enableTsqlt>

    Now, if we re-run the build in Bamboo (NB: I've moved to a new server here, hence the different address and build number): superb, a broken build!! The error message isn't great here, so to get more detailed info, click on the full build log link on this page (below the fold). The interesting part of the log shown is towards the bottom. Pulling out this part:
21-Jun-2013 11:35:19 21-Jun-2013 11:35:19 "C:\Users\Administrator\bamboo-home\xml-data\build-dir\RGA-RGP-JOB1\sqlCI.proj" (default target) (1) -> 21-Jun-2013 11:35:19 (sqlCI target) -> 21-Jun-2013 11:35:19 EXEC : sqlCI error occurred: RedGate.Deploy.SqlServerDbPackage.Shared.Exceptions.InvalidSqlException: Test Case Summary: 1 test case(s) executed, 0 succeeded, 1 failed, 0 errored. [C:\Users\Administrator\bamboo-home\xml-data\build-dir\RGA-RGP-JOB1\sqlCI.proj] 21-Jun-2013 11:35:19 EXEC : sqlCI error occurred: [MyChecks].[test Check Email Addresses] failed: [C:\Users\Administrator\bamboo-home\xml-data\build-dir\RGA-RGP-JOB1\sqlCI.proj] 21-Jun-2013 11:35:19 EXEC : sqlCI error occurred: ringo.starr@beatles [C:\Users\Administrator\bamboo-home\xml-data\build-dir\RGA-RGP-JOB1\sqlCI.proj] 21-Jun-2013 11:35:19 EXEC : sqlCI error occurred: [C:\Users\Administrator\bamboo-home\xml-data\build-dir\RGA-RGP-JOB1\sqlCI.proj] 21-Jun-2013 11:35:19 EXEC : sqlCI error occurred: +----------------------+ [C:\Users\Administrator\bamboo-home\xml-data\build-dir\RGA-RGP-JOB1\sqlCI.proj] 21-Jun-2013 11:35:19 EXEC : sqlCI error occurred: |Test Execution Summary| [C:\Users\Administrator\bamboo-home\xml-data\build-dir\RGA-RGP-JOB1\sqlCI.proj]   As a final check, we should make sure that, if we now fix this error, the build succeeds. So in SSMS, I’m going to correct the invalid email address, then check this change in to SQL Source Control (with a comment), commit to GitHub, and re-run the build:   This should have fixed the build: It worked! Summary This has been a very quick run through the implementation of CI for databases, including tSQLt tests to test whether your database updates are working. The next post in this series will focus on automated deployment – we’ve tested our database changes, how can we now deploy these to target sites?  

    Read the article

  • GRUB 2 problem after Mac OS X update

    - by vallllll
    I have a MacBook Pro in a dual boot Mac OS X / Ubuntu 12.04 (Precise Pangolin) setup. When I boot it I get a rEFIt menu, and I can choose between Mac OS X and Linux. A few days ago I updated Mac OS X from 10.7 (Lion) to 10.8 (Mountain Lion) using a .dmg image provided by my company. Since then, when I select Linux in rEFIt it says: No bootable device --insert boot disk and press any key. I have tried going to the rEFIt partitioning tool. This is what I got: As suggested in Mac OSX Mavericks update rEFIT broken I wanted to fix the issue the same way as AndrewM, but I don't have the option "MBR table must be updated". Then I booted from the Ubuntu 12.04 CD, chose "repair broken system", and chose the root partition /dev/sda6, as this is where my Ubuntu file system is. I got a shell, but I don't really know how to repair the problem. If it were just a Windows dual boot, a GRUB update would solve the issue, but here I don't know where GRUB 2 is installed. Here are the results from Parted, and it is a bit confusing for me as the Mac partition is the one with boot: As you can see, entry 1 is an EFI system partition and is the boot partition, so I wonder if I should install GRUB there or in sda6, which is the Ubuntu filesystem. I am not sure whether I should work in the rEFIt shell or in Ubuntu. Unfortunately, I don't remember where GRUB was before the update. UPDATE: using the same link above I tried RoundSparrow hilltx's answer and installed rEFInd, but the result is the same: still no bootable device when I select Linux. UPDATE 2: I just used the alternate CD again, mounted /dev/sda6 and then ran update-grub. It seemed to work and started listing all my kernels. But after rebooting several times, still no bootable device when I select Linux in rEFInd. UPDATE 3: I have tried to boot from the Ubuntu CD and select "boot from first available filesystem". I got an error and was dropped to a grub rescue shell.
    I even followed the indications on this link but was unable to boot; I tried to use sdb6 but no luck. UPDATE 4: as per Rob Smith's request, here is the output from ls -l $(find /EFI -iname "*.efi"):
    *MACOSX
    -rw-r--r--@ 1 root admin 55048 29 oct 17:44 /EFI/refind/drivers_x64/btrfs_x64.efi
    -rw-r--r--@ 1 root admin 38888 29 oct 17:44 /EFI/refind/drivers_x64/ext2_x64.efi
    -rw-r--r--@ 1 root admin 39304 29 oct 17:44 /EFI/refind/drivers_x64/ext4_x64.efi
    -rw-r--r--@ 1 root admin 43432 29 oct 17:44 /EFI/refind/drivers_x64/hfs_x64.efi
    -rw-r--r--@ 1 root admin 38984 29 oct 17:44 /EFI/refind/drivers_x64/iso9660_x64.efi
    -rw-r--r--@ 1 root admin 43656 29 oct 17:44 /EFI/refind/drivers_x64/reiserfs_x64.efi
    -rw-r--r--@ 1 root admin 175016 29 oct 17:44 /EFI/refind/refind_x64.efi
    -rw-rw-r-- 1 root admin 73232 7 mar 2010 /EFI/tools/dbounce.efi
    -rw-rw-r-- 1 root admin 763248 7 mar 2010 /EFI/tools/dhclient.efi
    -rw-rw-r-- 1 root admin 67024 7 mar 2010 /EFI/tools/drawbox.efi
    -rw-rw-r-- 1 root admin 71312 7 mar 2010 /EFI/tools/dumpfv.efi
    -rw-rw-r-- 1 root admin 84848 7 mar 2010 /EFI/tools/dumpprot.efi
    -rw-rw-r-- 1 root admin 472912 7 mar 2010 /EFI/tools/ed.efi
    -rw-rw-r-- 1 root admin 143856 7 mar 2010 /EFI/tools/edit.efi
    -rw-rw-r-- 1 root admin 1801008 7 mar 2010 /EFI/tools/ftp.efi
    -rw-r--r--@ 1 root admin 47848 29 oct 17:44 /EFI/tools/gptsync_x64.efi
    -rw-rw-r-- 1 root admin 320560 7 mar 2010 /EFI/tools/hexdump.efi
    -rw-rw-r-- 1 root admin 286384 7 mar 2010 /EFI/tools/hostname.efi
    -rw-rw-r-- 1 root admin 534416 7 mar 2010 /EFI/tools/ifconfig.efi
    -rw-rw-r-- 1 root admin 395344 7 mar 2010 /EFI/tools/loadarg.efi
    -rw-rw-r-- 1 root admin 587408 7 mar 2010 /EFI/tools/ping.efi
    -rw-rw-r-- 1 root admin 730416 7 mar 2010 /EFI/tools/pppd.efi
    -rw-rw-r-- 1 root admin 561360 7 mar 2010 /EFI/tools/route.efi
    -rw-rw-r-- 1 root admin 1961712 7 mar 2010 /EFI/tools/shell.efi
    -rw-rw-r-- 1 root admin 750224 7 mar 2010 /EFI/tools/tcpipv4.efi
    -rw-rw-r-- 1 root admin 4048 7 mar 2010 /EFI/tools/textmode.efi
    -rw-rw-r-- 1 root admin 320656 7 mar 2010 /EFI/tools/which.efi
    *LINUX
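
    (Editorial aside: a common way to reinstall GRUB 2 from the live CD is to chroot into the installed system first. The rough sketch below uses /dev/sda6, the Ubuntu root partition named in the question, and assumes a BIOS-style install of GRUB's boot code to the disk; on an EFI/rEFIt Mac setup the grub-efi packages and the EFI System Partition may be involved instead, so treat this only as a starting point:)

        # From the Ubuntu live CD/USB; device names taken from the question.
        sudo mount /dev/sda6 /mnt
        sudo mount --bind /dev  /mnt/dev
        sudo mount --bind /proc /mnt/proc
        sudo mount --bind /sys  /mnt/sys
        sudo chroot /mnt
        grub-install /dev/sda    # install GRUB's boot code to the disk
        update-grub              # regenerate grub.cfg with the detected kernels
        exit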

    Read the article

  • Google+ Platform Office Hours: A Movember of Metro-style Apps!

    Google+ Platform Office Hours: A Movember of Metro-style Apps! This week join Google+ Developer Relations team members Joanna Smith, Jonathan Beri, Silvano Luciani, and Gus Class for a special Movember GDL. We'll share updates for Google+, demonstrate Google+ Metro style apps integration in C#, and answer any questions you ask in the event and live YouTube comments. From: GoogleDevelopers Views: 0 0 ratings Time: 30:00 More in Science & Technology

    Read the article

  • [JP Japanese] Google Cast SDK Overview

    [JP Japanese] Google Cast SDK Overview: The Google Cast SDK has been released for everyone developing applications for Chromecast. In this episode we give an overview of the Google Cast SDK... From: Google Developers Views: 301 10 ratings Time: 05:17 More in Science & Technology

    Read the article

  • Make your code gooder with the goodies gem

    - by kerry
    I have decided to publish all my Ruby tools via a gem called ‘goodies’.  To install this gem simply type ‘gem install goodies’. The source is hosted on GitHub.  The first version (0.1) has the Hash object accessors and the String file path utility methods discussed in the previous two posts. Enjoy!   Ruby Goodies @ GitHub Goodies on gemcutter.org

    Read the article

  • Tool to identify (and remove) unnecessary website files?

    - by xanadont
    Inevitably I'll stop using an antiquated css, script, or image file. Especially when a separate designer is tinkering with things and testing out a few versions of images. Before I build one myself, are there any tools out there that will drill through a website and list unlinked files? Specifically, I'm interested in ASP.NET MVC sites, so detecting calls to (and among many other things) @Url.Content(...) is important.
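
    (Editorial aside: in the absence of a ready-made tool, a crude first pass can be scripted by listing the static files and grepping the source tree for each file name, flagging anything that is never mentioned. The sketch below assumes a Unix-style shell such as Git Bash or WSL, and the usual MVC folder names; it will report false positives for files referenced only through bundling or string concatenation, including @Url.Content calls built from variables:)

        # Flag css/js/image files whose names never appear in views, code or styles.
        find Content Scripts -type f \( -name '*.css' -o -name '*.js' -o -name '*.png' -o -name '*.gif' \) |
        while read -r f; do
          name=$(basename "$f")
          grep -rq "$name" --include='*.cshtml' --include='*.cs' --include='*.css' --include='*.js' --include='*.config' . \
            || echo "possibly unused: $f"
        done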

    Read the article

  • SEO - Google and link cleaning / cloaking [closed]

    - by Jens Törnell
    Possible Duplicate: Does the Google spider render JavaScript? This is an SEO-related question, not a code-related one. Google's own link cleaning / cloaking: Go to http://www.google.com and search for something. Hover over the title and you will see a link to the page you want to go to. The URL you see when hovering is NOT the link you are clicking on. Instead of clicking, you can drag the title a little bit and then hover over it. Then you will see the real URL. My own link cleaning / cloaking: Go to http://jsfiddle.net/NvmER/1/ and click the link, or look at the code below. You will be "redirected" to http://www.test.com. The real link is http://www.test.com/?event=23. Working code in case jsfiddle doesn't work: If you need to see how it works, I pasted the code below.

        <a class="direct" href="http://www.test.com/?event=23" data-redirect="http://www.test.com">Link</a>

        $(document).ready(function() {
          $("a.direct").live("mousedown", function(e){
            var newurl = $(this).attr('data-redirect');
            $(this).attr('href', newurl);
          });
        });

    Question: Is this OK with Google? It's done with JavaScript. If you have an answer, link to a source or a test to support it.

    Read the article

  • Disable YouTube Comments while using Chrome

    - by Asian Angel
    Are you tired of the comments at YouTube being full of profanity or irritating to look at? Now you can watch videos minus the comments with the No YouTube Comments extension for Google Chrome. Before: Sometimes reading through the comments at YouTube can be interesting or even fun, but at other times they can be a bother. It all depends on the contents of the comments (i.e. lots of profanity) and what you would like to have visible while watching videos. After: Once you have installed the extension, all that you will need to do is refresh the page (if you were already watching a video). As you can see, any and all signs of the comments section are gone. All nice and clean now… Conclusion: If you love to check out your favorite videos on YouTube but are annoyed by the multitude of meaningless and profane comments, this extension for Chrome gets the job done. Links: Download the No YouTube Comments extension (Google Chrome Extensions)

    Read the article

  • How to Create Custom SharePoint Workflows in Visual Studio 2008

    Whereas simple workflows are possible using Microsoft Office SharePoint Designer, you will soon reach the point where you will need to use Visual Studio. In the third article in Charles' introduction to Workflows in Sharepoint, he demonstrates how to create a workflow from scratch using Visual Studio, and discusses the relative merits of the two tools for this sort of development work.

    Read the article

  • How to Setup Your Verizon FIOS Router with OpenDNS or Google DNS

    - by The Geek
    Are you still using your service provider's DNS servers? You might have heard about Comcast's DNS servers dying and taking down the internet for anybody not using the more reliable OpenDNS or Google DNS. Here's how to set it up on your Verizon FIOS router for every device on your network. There are lots of other reasons to use OpenDNS or Google DNS beyond their rock-solid reliability: they are often much faster than your ISP's DNS server, and in the case of OpenDNS, there are loads of extra features like content filtering, typo correction, anti-phishing, and child protection controls. If you're using Windows, be sure to check out some of our other articles on the subject: Speed Up Your Web Browsing with Google Public DNS; Easily Add OpenDNS To Your Router; Protect Your Kids Online Using Open DNS. Otherwise, keep reading for how to set it up on your router.
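
    (Editorial aside: for reference, the public resolver addresses are 8.8.8.8 and 8.8.4.4 for Google DNS, and 208.67.222.222 and 208.67.220.220 for OpenDNS. A quick sanity check after changing the router, assuming a machine with dig or nslookup available; the domain is just an example:)

        # Compare an answer from your current resolver with one straight from Google DNS:
        nslookup howtogeek.com
        dig @8.8.8.8 howtogeek.com +short
        # If the router change took effect, devices that renew their DHCP lease should
        # be handed the router (or the new resolvers) as their DNS server.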

    Read the article

  • Five geeky things you must do with your Android Smartphone

    - by Gopinath
    Android is the Windows of the next generation. It's open, free, widely adopted and smart enough to outsmart Apple's iOS. It's a stolen product and a cheap imitation of iOS, but Steve Jobs was once quoted as saying that good artists copy and great artists steal. Alright, this post is not about Android vs. iOS or whether it is really stolen or not. Android is a great OS for mobile devices and it lets you do amazing things with a mobile. In this post I want to write about the geeky things we can do with an Android smartphone.

    Control your computer using your mobile. Assume that it is a lazy weekend and you are on a couch watching movies on a laptop which is a meter away. Now you want to adjust the volume or skip a scene/song. How do you control your laptop without moving off the couch? Just install the free Universal Remote app on your smartphone and start controlling your computer from your phone. Universal Remote controls computers over WiFi or Bluetooth networks, with dedicated remote controls for various media players and applications like YouTube, VLC & Spotify. The application is very easy to use and works amazingly well at controlling computers. A few of the remote controls provided in the app are Mouse, Keyboard, Media Controls, Power, Start, Windows Media Player, VLC Player and YouTube. There is also a paid version of this app with additional remotes, but for most users the free version is good enough.

    Stream YouTube videos playing on your mobile to your computer. You can stream YouTube videos playing on your mobile to a computer/smart TV. This is something similar to Apple's popular AirPlay feature, but it works only with YouTube videos. To start streaming videos, install Google's YouTube Remote on your smartphone, open youtube.com/leanback on your computer and pair the mobile with the computer. Once the pairing is done, videos played in the YouTube Remote app will be streamed onto your computer.

    Access your mobile using any web browser: send/receive SMS, view photos/call logs, etc. Want to control your mobile phone using a computer? Install the AirDroid app on your phone and start controlling your phone from a computer browser: send and receive messages, view call logs, play music, upload/download files, edit contacts and many more. At times it's a lot of fun to access your mobile from a big-screen device like a laptop.

    Launch a webpage in your mobile browser using your computer. With Google Chrome to Phone installed on your computer and mobile, you can send links and other information from the Chrome browser to your Android device. With a click in the Chrome browser, the current webpage will automatically be launched on the Android device. This is very handy when you want to send links, send driving directions to your mobile using Google Maps, or launch the phone dialer with a number selected on a webpage.

    Install apps on your mobile using your computer. To install apps on your smartphone you really don't need to touch it. Open any web browser, sign in to Google Play with the Google ID that is associated with your smartphone, and start installing apps onto your phone right from the browser. As you browse apps on the Google Play store you'll find an Install button, and all you need to do is click it. Google automatically installs the app on your mobile within a few seconds.

    Read the article

  • Crawling an ajax based page with both a hash fragment and a meta tag

    - by Christofian
    According to Google's documentation on crawling AJAX-based web pages, if a URL contains a hash fragment, or something at the end of the URL that looks like #helloworld, and if there is an ! after the #, as in #!helloworld, Google will then request the URL url?_escaped_fragment_=helloworld. I currently have an AJAX-based webpage that I want Google to be able to crawl. Sometimes the page uses hash fragments, and for those situations I set up the server so it will return an HTML snapshot for that page using _escaped_fragment_. However, that webpage often does not load a hash fragment, and when that happens the webpage still loads content using AJAX. I couldn't find a good solution to enable AJAX crawling for pages that sometimes have a hash fragment and sometimes don't. How can I tell Google to use _escaped_fragment_ when there is a hash fragment, and to use something else to get an HTML snapshot of a page when there isn't a hash fragment?
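
    (Editorial aside: Google's AJAX-crawling scheme, since deprecated, covered exactly this case. A page that is reachable without a #! can opt in by serving <meta name="fragment" content="!"> in its head, which tells the crawler to re-request the same URL with an empty _escaped_fragment_ parameter; hash-fragment states keep working as before. A sketch of the two request shapes the server then has to answer with HTML snapshots, with example.com as a placeholder:)

        # State carried in a hash fragment: #!key=value arrives as a query parameter
        curl 'http://www.example.com/page?_escaped_fragment_=key=value'
        # Page with no hash fragment, opted in via <meta name="fragment" content="!">:
        curl 'http://www.example.com/page?_escaped_fragment_='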

    Read the article

  • Web Site Performance and Assembly Versioning

    - by capgpilk
    I originally wanted to write this post in one, but there is quite a large amount of information which can be broken down into different areas, so I am going to publish it in three posts. Minification and Concatenation of JavaScript and CSS Files – this post. Versioning Combined Files Using Subversion – published shortly. Versioning Combined Files Using Mercurial – published shortly.

    Website Performance

    There are many ways to improve web site performance; two areas are reducing the amount of data that is served up from the web server and reducing the number of files that are requested. Here I will outline the process of minimizing and concatenating your JavaScript and CSS files automatically at build time of your Visual Studio web site/application. To edit the project file in Visual Studio, you need to first unload it by right-clicking the project in Solution Explorer. I prefer to do this in a third-party tool such as Notepad++ and save it there, forcing VS to reload it each time I make a change, as the whole process in Visual Studio can be a bit tedious. Now you have the project file, you will notice that it is an MSBuild project file. I am going to use a fantastic utility from Microsoft called Ajax Minifier. This tool minifies both JavaScript and CSS.

    1. Import the tasks for AjaxMin, choosing the location you installed to. I keep all third-party utilities in a Tools directory within my solution structure and source control. This way I know I can get the entire solution from source control without worrying about what other tools I need to get the project to build locally.

        <Import Project="..\Tools\MicrosoftAjaxMinifier\AjaxMin.tasks" />

    2. Now create ItemGroups for all your JS and CSS files like this, separating out your non-minified and minified files. This can go in the AfterBuild container.

        <Target Name="AfterBuild">

        <!-- Javascript files that need minimizing -->
        <ItemGroup>
          <JSMin Include="Scripts\jqModal.js" />
          <JSMin Include="Scripts\jquery.jcarousel.js" />
          <JSMin Include="Scripts\shadowbox.js" />
        </ItemGroup>
        <!-- CSS files that need minimizing -->
        <ItemGroup>
          <CSSMin Include="Content\Site.css" />
          <CSSMin Include="Content\themes\base\jquery-ui.css" />
          <CSSMin Include="Content\shadowbox.css" />
        </ItemGroup>

        <!-- Javascript files to combine -->
        <ItemGroup>
          <JSCat Include="Scripts\jqModal.min.js" />
          <JSCat Include="Scripts\jquery.jcarousel.min.js" />
          <JSCat Include="Scripts\shadowbox.min.js" />
        </ItemGroup>
        <!-- CSS files to combine -->
        <ItemGroup>
          <CSSCat Include="Content\Site.min.css" />
          <CSSCat Include="Content\themes\base\jquery-ui.min.css" />
          <CSSCat Include="Content\shadowbox.min.css" />
        </ItemGroup>

    3. Call AjaxMin to do the crunching.

        <Message Text="Minimizing JS and CSS Files..." Importance="High" />
        <AjaxMin JsSourceFiles="@(JSMin)" JsSourceExtensionPattern="\.js$"
                 JsTargetExtension=".min.js" JsEvalTreatment="MakeImmediateSafe"
                 CssSourceFiles="@(CSSMin)" CssSourceExtensionPattern="\.css$"
                 CssTargetExtension=".min.css" />

    This will create the *.min.css and *.min.js files in the same directory the original files were in.

    4. Now concatenate the minified files into one for JavaScript and another for CSS. Here we write out the files with a default file name. In later posts I will cover versioning these files the same as your project assembly, again to help performance.

        <Message Text="Concat JS Files..." Importance="High" />
        <ReadLinesFromFile File="%(JSCat.Identity)">
          <Output TaskParameter="Lines" ItemName="JSLinesSite" />
        </ReadLinesFromFile>
        <WriteLinesToFile File="Scripts\site-script.combined.min.js" Lines="@(JSLinesSite)"
                          Overwrite="true" />
        <Message Text="Concat CSS Files..." Importance="High" />
        <ReadLinesFromFile File="%(CSSCat.Identity)">
          <Output TaskParameter="Lines" ItemName="CSSLinesSite" />
        </ReadLinesFromFile>
        <WriteLinesToFile File="Content\site-style.combined.min.css" Lines="@(CSSLinesSite)"
                          Overwrite="true" />

    5. Save the project file. If you have Visual Studio open it will ask you to reload the project. You can now run a build and these minified and combined files will be created automatically.

    6. Finally, reference these minified combined files in your web page. In the next two posts I will cover versioning these files to match your assembly.

    Read the article

  • Need a Better Cleanup Tool [Humorous Image]

    - by Asian Angel
    Obviously some cleanup tools work better than others…and sometimes common-sense cleanup is the best tool of all! Note: Notice the timeline in the image… View the Full-Size Version of the Image. Got a sales guy's laptop back… Note the times. [via Fail Desk]

    Read the article

  • On codecs, HTML5 and browsers

    - by Eugenio Estrada
    Anyone who has been paying even a little attention to tech news will know that this week, at its Google I/O conference, Google released its VP8 video codec to join the codec war in the browsers. It has released this codec as part of the WebM format, which includes the Ogg codec for the audio, all wrapped up in a Matroska container (MKV to its friends). The interesting part is that they appear to have optimized it for better use on the web, for desktop, mobile and now television (with Google TV...(read more)

    Read the article

  • Which language and platform features really boosted your coding speed?

    - by Serge
    The question is about delivering working code faster without any regard for design, quality, maintainability, etc. Here is the list of things that help me to write and read code faster:
    Language: static typing, support for object-oriented and functional programming styles, embedded documentation, short compile-debug-fix cycle or REPL, automatic memory management.
    Platform: "batteries" included (text, regex, IO, threading, networking), thriving community, tons of open-source libs.
    Tools: IDE, visual debugger, code-completion, code navigation, refactoring.

    Read the article
