Search Results


  • Running a simple integration scenario using the Oracle Big Data Connectors on Hadoop/HDFS cluster

    - by hamsun
    Between the elephant (the traditional image of the Hadoop framework) and the Oracle Iron Man (Big Data...), an English setter could be seen as the link to the right data.

Data, data, data: we are living in a world where data technology - based on popular applications, search engines, web servers, rich SMS messages, email clients, weather forecasts and so on - plays a predominant role in our lives. More and more technologies are used to analyze and track our behavior, detect patterns, and propose "the best/right user experience" - from the Google Ad services to telco companies or large consumer sites (like Amazon :)). The more we use all these technologies, the more data we generate, and thus there is a need for huge data marts and specific hardware/software servers (such as the Exadata servers) in order to process, analyze and understand the trends and offer new services to the users.

Some of these "data feeds" are raw, unstructured data and cannot be processed effectively by normal SQL queries. Large-scale distributed processing was an emerging infrastructure need, and the solution seemed to be the "collocation of compute nodes with the data", which in turn led to MapReduce parallel patterns and the development of the Hadoop framework, which is based on MapReduce and a distributed file system (HDFS) that runs on large clusters of rather inexpensive servers. Several Oracle products use the distributed/aggregation pattern for data calculation (Coherence, NoSQL, TimesTen), so once you are familiar with one of these technologies - let's say with Coherence aggregators - you will find the whole Hadoop/MapReduce concept very similar.

Oracle Big Data Appliance is based on the Cloudera Distribution (CDH), and the Oracle Big Data Connectors can be plugged into a Hadoop cluster running CDH or an equivalent Hadoop distribution. In this paper, a "lab-like" implementation of this concept is done on a single Linux x64 server, running Oracle Database 11g Enterprise Edition Release 11.2.0.4.0 and a single-node Apache hadoop-1.2.1 HDFS cluster, using the SQL Connector for HDFS. The whole setup is fairly simple:

- Install Oracle Database 11g Enterprise Edition Release 11.2.0.4.0 on a Linux x64 server (or VirtualBox appliance).
- Get the Apache Hadoop distribution from: http://mir2.ovh.net/ftp.apache.org/dist/hadoop/common/hadoop-1.2.1.
- Get the Oracle Big Data Connectors from: http://www.oracle.com/technetwork/bdc/big-data-connectors/downloads/index.html?ssSourceSiteId=ocomen.
- Check the Java version of your Linux server with the command: java -version (here: java version "1.7.0_40", Java(TM) SE Runtime Environment (build 1.7.0_40-b43), Java HotSpot(TM) 64-Bit Server VM (build 24.0-b56, mixed mode)).
- Decompress the hadoop-1.2.1.tar.gz file to /u01/hadoop-1.2.1.
- Modify your .bash_profile: export HADOOP_HOME=/u01/hadoop-1.2.1, export HIVE_HOME=/u01/hive-0.11.0 and export PATH=$PATH:$HADOOP_HOME/bin:$HIVE_HOME/bin (also see my sample .bash_profile).
- Set up SSH trust for the Hadoop processes. This is a mandatory step; in our case we have to establish a "local trust", as we are using a single-node configuration: copy the new public key to the list of authorized keys, then connect and test the SSH setup to your localhost.

We will run a single-node "pseudo-Hadoop cluster": all the Hadoop Java daemons run on one machine, which is enough for our demo purposes. A minimal sketch of the environment and SSH trust steps follows below.
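A minimal sketch of those two preparation steps, assuming the paths used in this article and that the oracle user simply trusts itself on localhost (the exact ssh-keygen options are not spelled out in the original article; the ones below are the usual defaults):

        # Hadoop/Hive environment, appended to ~/.bash_profile as described above
        echo 'export HADOOP_HOME=/u01/hadoop-1.2.1'                >> ~/.bash_profile
        echo 'export HIVE_HOME=/u01/hive-0.11.0'                   >> ~/.bash_profile
        echo 'export PATH=$PATH:$HADOOP_HOME/bin:$HIVE_HOME/bin'   >> ~/.bash_profile
        source ~/.bash_profile

        # "Local trust": passphrase-less key pair, authorized for logins to localhost
        mkdir -p ~/.ssh && chmod 700 ~/.ssh
        ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
        cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
        chmod 600 ~/.ssh/authorized_keys
        ssh localhost 'echo ssh trust is working'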
We need to "fine tune" some Hadoop configuration files: go to $HADOOP_HOME/conf and modify the files core-site.xml, hdfs-site.xml and mapred-site.xml. Check that the hadoop binaries are referenced correctly from the command line by executing: hadoop -version

As Hadoop is managing our "clustered HDFS" file system, we have to create "the mount point" and format it; the mount point will be declared in core-site.xml as: The layout under /u01/hadoop-1.2.1/data will be created and used by the other Hadoop components (MapReduce = /mapred/...), while HDFS uses the /dfs/... layout structure. Format the HDFS Hadoop file system, then start the Java components for the HDFS system. As an additional check, you can use the GUI Hadoop browsers to check the content of your HDFS configuration.

Once our HDFS Hadoop setup is done, you can use the HDFS file system to store data (big data :)) and move it back and forth to Oracle databases by means of the Big Data Connectors (which is the next configuration step). You could create/use a Hive DB, but in our case we will make a simple integration of "raw data" through the creation of an external table in a local Oracle instance (on the same Linux box we run the single-node Hadoop HDFS cluster and one Oracle DB).

Download some public "big data". I use the site http://france.meteofrance.com/france/observations, from where I can get *.csv files for my big data simulations :). Here is the data layout of my example file:

Download the Big Data Connector from OTN (oraosch-2.2.0.zip) and unzip it to your local file system (see picture below). Modify your environment in order to access the connector libraries, and make the following test: [oracle@dg1 bin]$ ./hdfs_stream Usage: hdfs_stream locationFile [oracle@dg1 bin]$

Load the data into the Hadoop HDFS file system: hadoop fs -mkdir bgtest_data hadoop fs -put obsFrance.txt bgtest_data/obsFrance.txt hadoop fs -ls /user/oracle/bgtest_data/obsFrance.txt [oracle@dg1 bg-data-raw]$ hadoop fs -ls /user/oracle/bgtest_data/obsFrance.txt Found 1 items -rw-r--r-- 1 oracle supergroup 54103 2013-10-22 06:10 /user/oracle/bgtest_data/obsFrance.txt [oracle@dg1 bg-data-raw]$ hadoop fs -ls hdfs:///user/oracle/bgtest_data/obsFrance.txt Found 1 items -rw-r--r-- 1 oracle supergroup 54103 2013-10-22 06:10 /user/oracle/bgtest_data/obsFrance.txt Check the content of the HDFS with the browser UI.

Start the Oracle database and run the following script in order to create the Oracle database user and the Oracle directories for the Oracle Big Data Connector (dg1 is my own DB SID; replace it with yours accordingly): #!/bin/bash export ORAENV_ASK=NO export ORACLE_SID=dg1 .
oraenv sqlplus /nolog <<EOF CONNECT / AS sysdba; CREATE OR REPLACE DIRECTORY osch_bin_path AS '/u01/orahdfs-2.2.0/bin'; CREATE USER BGUSER IDENTIFIED BY oracle; GRANT CREATE SESSION, CREATE TABLE TO BGUSER; GRANT EXECUTE ON sys.utl_file TO BGUSER; GRANT READ, EXECUTE ON DIRECTORY osch_bin_path TO BGUSER; CREATE OR REPLACE DIRECTORY BGT_LOG_DIR as '/u01/BG_TEST/logs'; GRANT READ, WRITE ON DIRECTORY BGT_LOG_DIR to BGUSER; CREATE OR REPLACE DIRECTORY BGT_DATA_DIR as '/u01/BG_TEST/data'; GRANT READ, WRITE ON DIRECTORY BGT_DATA_DIR to BGUSER; EOF Put the following in a file named t3.sh and make it executable, hadoop jar $OSCH_HOME/jlib/orahdfs.jar \ oracle.hadoop.exttab.ExternalTable \ -D oracle.hadoop.exttab.tableName=BGTEST_DP_XTAB \ -D oracle.hadoop.exttab.defaultDirectory=BGT_DATA_DIR \ -D oracle.hadoop.exttab.dataPaths="hdfs:///user/oracle/bgtest_data/obsFrance.txt" \ -D oracle.hadoop.exttab.columnCount=7 \ -D oracle.hadoop.connection.url=jdbc:oracle:thin:@//localhost:1521/dg1 \ -D oracle.hadoop.connection.user=BGUSER \ -D oracle.hadoop.exttab.printStackTrace=true \ -createTable --noexecute then test the creation fo the external table with it: [oracle@dg1 samples]$ ./t3.sh ./t3.sh: line 2: /u01/orahdfs-2.2.0: Is a directory Oracle SQL Connector for HDFS Release 2.2.0 - Production Copyright (c) 2011, 2013, Oracle and/or its affiliates. All rights reserved. Enter Database Password:] The create table command was not executed. The following table would be created. CREATE TABLE "BGUSER"."BGTEST_DP_XTAB" ( "C1" VARCHAR2(4000), "C2" VARCHAR2(4000), "C3" VARCHAR2(4000), "C4" VARCHAR2(4000), "C5" VARCHAR2(4000), "C6" VARCHAR2(4000), "C7" VARCHAR2(4000) ) ORGANIZATION EXTERNAL ( TYPE ORACLE_LOADER DEFAULT DIRECTORY "BGT_DATA_DIR" ACCESS PARAMETERS ( RECORDS DELIMITED BY 0X'0A' CHARACTERSET AL32UTF8 STRING SIZES ARE IN CHARACTERS PREPROCESSOR "OSCH_BIN_PATH":'hdfs_stream' FIELDS TERMINATED BY 0X'2C' MISSING FIELD VALUES ARE NULL ( "C1" CHAR(4000), "C2" CHAR(4000), "C3" CHAR(4000), "C4" CHAR(4000), "C5" CHAR(4000), "C6" CHAR(4000), "C7" CHAR(4000) ) ) LOCATION ( 'osch-20131022081035-74-1' ) ) PARALLEL REJECT LIMIT UNLIMITED; The following location files would be created. osch-20131022081035-74-1 contains 1 URI, 54103 bytes 54103 hdfs://localhost:19000/user/oracle/bgtest_data/obsFrance.txt Then remove the --noexecute flag and create the external Oracle table for the Hadoop data. Check the results: The create table command succeeded. CREATE TABLE "BGUSER"."BGTEST_DP_XTAB" ( "C1" VARCHAR2(4000), "C2" VARCHAR2(4000), "C3" VARCHAR2(4000), "C4" VARCHAR2(4000), "C5" VARCHAR2(4000), "C6" VARCHAR2(4000), "C7" VARCHAR2(4000) ) ORGANIZATION EXTERNAL ( TYPE ORACLE_LOADER DEFAULT DIRECTORY "BGT_DATA_DIR" ACCESS PARAMETERS ( RECORDS DELIMITED BY 0X'0A' CHARACTERSET AL32UTF8 STRING SIZES ARE IN CHARACTERS PREPROCESSOR "OSCH_BIN_PATH":'hdfs_stream' FIELDS TERMINATED BY 0X'2C' MISSING FIELD VALUES ARE NULL ( "C1" CHAR(4000), "C2" CHAR(4000), "C3" CHAR(4000), "C4" CHAR(4000), "C5" CHAR(4000), "C6" CHAR(4000), "C7" CHAR(4000) ) ) LOCATION ( 'osch-20131022081719-3239-1' ) ) PARALLEL REJECT LIMIT UNLIMITED; The following location files were created. 
osch-20131022081719-3239-1 contains 1 URI, 54103 bytes 54103 hdfs://localhost:19000/user/oracle/bgtest_data/obsFrance.txt

This is the view from SQL Developer: and finally the number of lines in the Oracle table, imported from our Hadoop HDFS cluster: SQL> select count(*) from "BGUSER"."BGTEST_DP_XTAB"; COUNT(*) ---------- 1151

In a next post we will integrate data from a Hive database and try some ODI integrations with the ODI Big Data connector. Our simplistic approach is just a first step to show you how this unstructured data world can be integrated into an Oracle infrastructure. Hadoop, Big Data and NoSQL are great technologies; they are widely used, and Oracle offers a large integration infrastructure based on these services.

Oracle University presents a complete curriculum on all the Oracle related technologies: NoSQL: Introduction to Oracle NoSQL Database, Using Oracle NoSQL Database. Big Data: Introduction to Big Data, Oracle Big Data Essentials, Oracle Big Data Overview. Oracle Data Integrator: Oracle Data Integrator 12c: New Features, Oracle Data Integrator 11g: Integration and Administration, Oracle Data Integrator: Administration and Development, Oracle Data Integrator 11g: Advanced Integration and Development. Oracle Coherence 12c: Oracle Coherence 12c: New Features, Oracle Coherence 12c: Share and Manage Data in Clusters. Oracle GoldenGate 11g: Fundamentals for Oracle, Fundamentals for SQL Server, Fundamentals for DB2, Fundamentals for Teradata, Fundamentals for HP NonStop, Management Pack: Overview, Troubleshooting and Tuning, Advanced Configuration for Oracle.

Other Resources: Apache Hadoop: http://hadoop.apache.org/ is the homepage for these technologies. "Hadoop: The Definitive Guide, 3rd Edition" by Tom White is a classic read for people who want to know more about Hadoop, and some active "googling" will also give you more references.

About the author: Eugene Simos is based in France and joined Oracle through the BEA-WebLogic acquisition, where he worked in Professional Services, Support and Education for major accounts across the EMEA region. He worked in the banking sector and with AT&T and telco companies, giving him extensive experience with production environments. Eugene currently specializes in Oracle Fusion Middleware, teaching an array of courses on WebLogic, WebCenter, Content, BPM, SOA, Identity & Security, GoldenGate, Virtualisation and Unified Communications Suite throughout the EMEA region.
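To round off the walkthrough above: once the external table exists, the HDFS-resident file can be queried like any other Oracle table and, if desired, materialized into a local heap table. A hedged sketch of such a session (the BGTEST_OBS_LOCAL table name is illustrative; user and password are the ones created by the setup script earlier):

        [oracle@dg1 ~]$ sqlplus BGUSER/oracle
        SQL> -- query the HDFS-backed external table directly
        SQL> SELECT COUNT(*) FROM BGTEST_DP_XTAB;
        SQL> -- optionally materialize the observations into a regular heap table
        SQL> CREATE TABLE BGTEST_OBS_LOCAL AS SELECT * FROM BGTEST_DP_XTAB;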

    Read the article

  • CodePlex Daily Summary for Wednesday, March 31, 2010

    CodePlex Daily Summary for Wednesday, March 31, 2010New ProjectsBase Class Libraries: The Base Class Libraries site hosts samples, previews, and prototypes from the BCL team. BB Scheduler - BroadBand Scheduler: Broadband Scheduler is highly useful as it helps the user to set the time when the computer will automatically enable the Broadband (Internet) conn...BFBC2 PRoCon: BFBC2 PRoCon makes it easier for Bad Company 2 GSP's and private server owners to administer their BFBC2 servers. It's developed in C# and targete...Business Process Management Virtual Environment (BPMVE): This BPMVE project has been separated into 3 different projects. BPMVE_DataStructure project contains all data structures (classes and attribute...Business Rule Engine BizUnit Test Steps: Business Rule Engine BizUnit Test StepsCint: CintContent Edit Extender for Ajax.Net: The Content Edit Extender is an Ajax.Net control extender that allows in-place editing of a div tag (panel). Double-click to edit, hit enter or tab...COV Game: Cov game is a worms like game made on Silverlight with Python server.Cybera: A continuing development project of the existing but now generally inactive former Cybera project.DotNetCRM Community Edition: DotNetCRM Community Edition is an open source, enterprise class CRM built on the .NET platform. It is designed to be extensible, configurable, data...EAV: A sample EAV pattern for SQL Server with: Tables and indexes Partial referential integrity Partial data typing Updatable views (like normal SQL table)EditRegion.MVC: EditRegion.MVC may be all you want, when you do not want a full CMS. It allows html areas to be edited by nominated users or roles. The API follo...Firestarter Modeller: Firestarter ModellerHabanero.Testability: Habanero.TestabilityProSoft CMS: CMS System - scalable over an undeclared amount of servers - publishing services - version control of sitesPS-Blog.net: This is my first project here on codeplex. I would like to write my own blog software. Any comments or critcs are welcome.ReleaseMe: ReleaseMe is a simple little tool I use to copy websites, and custom Window Services, from my development machine to a specified production machin...SAAS-RD: SAAS-RD: uma ferrameta utilizada para prover integração de SaaS com aplicações externasSample Web Application using Telerik's OpenAccess ORM: Sample Web Site Application Project that uses Telerik's OpenAccess ORM for data access.Sistema Facturacion: En el proyecto de Sistema de Facturacion se desarrollara una aplicacion para el total control de un establecimiento comercial Smooth Habanero: Smooth HabaneroSouthEast Conference 2011: For the Florida Institute of Technology IEEE Chapter regarding the Southeast Hardware Conference held in Nashville, TN 2011.SQL Server Bible Standards: A SQL Server Design and Development standards document. SSAS Profiler Trace Scheduler: AS Profiler Scheduler is a tool that will enable Scheduling of SQL AS Tracing using predefined Profiler Templates. For tracking different issues th...Symbolic Algebra: Another attempt to make an algebric system but to be natively in C# under the .net framework. Theocratic Ministry School System: This is an Open Source Theocratic Ministry School System designed for Jehovah's Witnesses. It will include much of the same features as the TMS ver...Weather Report WebControls: The First Release Version:1.0.10330.2334WPF 3D Labyrinth: A project for "Design and Analysis of Computer Algorithms" subject at Kaunas University of Technology. 
Building a 3D labyrinth with a figure which ...WPF Zen Garden: This is intended to be a gallery for WPF Style sheets in the form of Css Zen Garden. New ReleasesAPSales CRM - Software as a Service: APSales 0.1.3: This version add some interesting features to the project: Implement "Filter By Additional Fields" in view edit Implement quick create function Im...Base Class Libraries: BigRational: BigRational builds on the BigInteger introduced in .NET Framework 4 to create an arbitrary-precision rational number type. A rational number is a ...Base Class Libraries: Long Path: The long path wrapper provides functionality to make it easier to work with paths that are longer than the current 259 character limit of the Syste...Base Class Libraries: PerfMonitor: PerfMonitor is a command-line tool for profiling the system using Event Tracing for Windows (ETW). PerfMonitor is built on top of the TraceEvent li...Base Class Libraries: TraceEvent: TraceEvent is an experimental library that greatly simplifies reading Event Tracing for Windows (ETW) events. It is used by the PerfMonitor tool. ...BB Scheduler - BroadBand Scheduler: Broadband Scheduler v2.0: - Broadband service has some of the cheap and best monthly plans for the users all over the nation. And some of the plans include unlimited night d...BuildTools - Toolset for automated builds: BuildTools 2.0 Mar 2010 Milestone: The Mar 2010 Milestone release is a contains a bug fixes for projects not explicitly setting the StartingDate property, and no longer breaks when t...Business Rule Engine BizUnit Test Steps: BRE BizUnit Test Steps Ver. 1.0: Version 1.0Claymore MVP: Claymore 1.1.0.0: Changelog Added Compact Framework support Added fluent interface to configure the library.Content Edit Extender for Ajax.Net: ContentEditExtender 1.0 for Ajax.Net: Complete with source control and test/example Website and Web Service. Built with Visual Studio 2008 with the 3.5 BCL. Control requires the AjaxCon...dylan.NET: dylan.NET v. 9.6: This stable version of the compiler for both .NET 3.5 and 4.0 adds the loading of numbers in a bult-in fashion. See code below: #refasm mscorlib...EAV: March 2010: The initial release as demoed at the SSWUG Virtual Conference Spring 2010Fax .NET: Fax .NET 1.0.1: FIX : bugs for x64 and WOW64 architecture. The zip file include : Binary file Demo executable file Help fileFluent Ribbon Control Suite: Fluent Ribbon Control Suite 1.0 for NET 4.0 RC: Includes: Fluent.dll (with .pdb and .xml) compiled with .NET 4.0 RC Test application compiled with .NET 4.0 RC SourcesIceChat: IceChat 2009 Alpha 12.1 Full Install: Build Alpha 12.1 - March 30 2010 Fix Nick Name Change for Tabs/Server List for Queries Fix for running Channel List Multiple Times, clears list n...Import Excel data to SharePoint List: Import Data from Spreadsheet to SP List V1.5 x64: Import from Spreadsheet to a SharePoint List is the missing facet to the WSS 3.0 / MOSS 2007 List features. 
SharePoint lets a user create a custom...LINQ to Twitter: LINQ to Twitter Beta v2.0.9: New items added since v1.1 include: Support for OAuth (via DotNetOpenAuth), secure communication via https, VB language support, serialization of ...mojoPortal: 2.3.4.1: see release notes on mojoportal.com http://www.mojoportal.com/mojoportal-2341-released.aspxocculo: test: Release build for testers.PowerShell ToodleDo Module: 0.1: Initial Development Release - Very rough build.Quick Performance Monitor: Version 1.2: Fixed issue where app crash when performance counter disappear or becomes unavailable while the application is running. For now the exception is si...Quick Performance Monitor: Version 1.3: Add 'view last error'Rule 18 - Love your clipboard: Rule 18 (Visual Studio 2010 + .NET 4 RC Version): This is the second public beta for the first version of Rule 18. It has had a extensive private beta and been used in big presentations since the ...Selection Maker: Selection Maker 1.5: New Features:If the source folder does not exist,a dialog box will appear and ask the user if he/she wants to create that folder or if select anoth...sPATCH: sPatch v0.9a: + Fixed: wrong path to elementclient.exeSQL Server Bible Standards: March 2010: Initial release as presented at SSWUG Virtual Conference Spring 2010Survey - web survey & form engine: Source Code Documentation: Documentation.chm file as published with Nsurvey v. 1.9.1 - april 2005 Basic technical documentation and description of source code for developers...Theocratic Ministry School System: Theocratic Ministry School System - TMSS: This is the first release of TMSS. It is far from complete but demonstrates the possiablities of what can be done with Access 2007 using developer ...Weather Report WebControls: WeatherReport Controls: 本下载包含一个已经经过编译的二进制运行库和一个测试的WebApplication项目,是2010年3月30日发布的Most Popular ProjectsRawrWBFS ManagerASP.NET Ajax LibraryMicrosoft SQL Server Product Samples: DatabaseSilverlight ToolkitAJAX Control ToolkitWindows Presentation Foundation (WPF)LiveUpload to FacebookASP.NETMicrosoft SQL Server Community & SamplesMost Active ProjectsRawrjQuery Library for SharePoint Web ServicesBase Class LibrariesBlogEngine.NETManaged Extensibility FrameworkFarseer Physics EngineGraffiti CMSMicrosoft Biology FoundationLINQ to Twitterpatterns & practices – Enterprise Library

    Read the article

  • Time management and self improvement

    - by Filip
    Hi, I hope I can open a discussion on this topic, as this is not a specific problem; it's a topic where I hope to get some ideas on how people in a similar situation to mine manage their time. OK, I've been the single developer on a software project for the last 6-8 months. The project I'm working on uses several technologies, mainly .NET stuff: WPF, WF, NHibernate, WCF, MySQL and other third-party SDKs relevant to the nature of the project. My experience and knowledge vary; for example, I have a lot of experience in WPF but much less in WCF. I work full time on the project and I'm curious how other programmers who need to multi-task across many areas manage their time. I'm a very hands-on type of person and prefer to code instead of doing research. I feel that doing research "might" slow down the progress of the project, while I recognize that research and learning more in areas where I'm not so strong will ultimately make me more productive. How would you split up your daily time between productive coding time and time to experiment, read blogs, go through tutorials, etc.? I would say that I'm coding about 90%+ of my day and devoting some, but very little, time to research and acquiring new knowledge.

    Read the article

  • What's up with LDoms: Part 1 - Introduction & Basic Concepts

    - by Stefan Hinker
    LDoms - the correct name is Oracle VM Server for SPARC - have been around for quite a while now.  But to my surprise, I get more and more requests to explain how they work or to give advise on how to make good use of them.  This made me think that writing up a few articles discussing the different features would be a good idea.  Now - I don't intend to rewrite the LDoms Admin Guide or to copy and reformat the (hopefully) well known "Beginners Guide to LDoms" by Tony Shoumack from 2007.  Those documents are very recommendable - especially the Beginners Guide, although based on LDoms 1.0, is still a good place to begin with.  However, LDoms have come a long way since then, and I hope to contribute to their adoption by discussing how they work and what features there are today.  In this and the following posts, I will use the term "LDoms" as a common abbreviation for Oracle VM Server for SPARC, just because it's a lot shorter and easier to type (and presumably, read). So, just to get everyone on the same baseline, lets briefly discuss the basic concepts of virtualization with LDoms.  LDoms make use of a hypervisor as a layer of abstraction between real, physical hardware and virtual hardware.  This virtual hardware is then used to create a number of guest systems which each behave very similar to a system running on bare metal:  Each has its own OBP, each will install its own copy of the Solaris OS and each will see a certain amount of CPU, memory, disk and network resources available to it.  Unlike some other type 1 hypervisors running on x86 hardware, the SPARC hypervisor is embedded in the system firmware and makes use both of supporting functions in the sun4v SPARC instruction set as well as the overall CPU architecture to fulfill its function. The CMT architecture of the supporting CPUs (T1 through T4) provide a large number of cores and threads to the OS.  For example, the current T4 CPU has eight cores, each running 8 threads, for a total of 64 threads per socket.  To the OS, this looks like 64 CPUs.  The SPARC hypervisor, when creating guest systems, simply assigns a certain number of these threads exclusively to one guest, thus avoiding the overhead of having to schedule OS threads to CPUs, as do typical x86 hypervisors.  The hypervisor only assigns CPUs and then steps aside.  It is not involved in the actual work being dispatched from the OS to the CPU, all it does is maintain isolation between different guests. Likewise, memory is assigned exclusively to individual guests.  Here,  the hypervisor provides generic mappings between the physical hardware addresses and the guest's views on memory.  Again, the hypervisor is not involved in the actual memory access, it only maintains isolation between guests. During the inital setup of a system with LDoms, you start with one special domain, called the Control Domain.  Initially, this domain owns all the hardware available in the system, including all CPUs, all RAM and all IO resources.  If you'd be running the system un-virtualized, this would be what you'd be working with.  To allow for guests, you first resize this initial domain (also called a primary domain in LDoms speak), assigning it a small amount of CPU and memory.  This frees up most of the available CPU and memory resources for guest domains.  IO is a little more complex, but very straightforward.  When LDoms 1.0 first came out, the only way to provide IO to guest systems was to create virtual disk and network services and attach guests to these services.  
In the meantime, several different ways to connect guest domains to IO have been developed, the most recent one being SR-IOV support for network devices released in version 2.2 of Oracle VM Server for SPARC. I will cover these more advanced features in detail later.  For now, lets have a short look at the initial way IO was virtualized in LDoms: For virtualized IO, you create two services, one "Virtual Disk Service" or vds, and one "Virtual Switch" or vswitch.  You can, of course, also create more of these, but that's more advanced than I want to cover in this introduction.  These IO services now connect real, physical IO resources like a disk LUN or a networt port to the virtual devices that are assigned to guest domains.  For disk IO, the normal case would be to connect a physical LUN (or some other storage option that I'll discuss later) to one specific guest.  That guest would be assigned a virtual disk, which would appear to be just like a real LUN to the guest, while the IO is actually routed through the virtual disk service down to the physical device.  For network, the vswitch acts very much like a real, physical ethernet switch - you connect one physical port to it for outside connectivity and define one or more connections per guest, just like you would plug cables between a real switch and a real system. For completeness, there is another service that provides console access to guest domains which mimics the behavior of serial terminal servers. The connections between the virtual devices on the guest's side and the virtual IO services in the primary domain are created by the hypervisor.  It uses so called "Logical Domain Channels" or LDCs to create point-to-point connections between all of these devices and services.  These LDCs work very similar to high speed serial connections and are configured automatically whenever the Control Domain adds or removes virtual IO. To see all this in action, now lets look at a first example.  I will start with a newly installed machine and configure the control domain so that it's ready to create guest systems. In a first step, after we've installed the software, let's start the virtual console service and downsize the primary domain.  root@sun # ldm list NAME STATE FLAGS CONS VCPU MEMORY UTIL UPTIME primary active -n-c-- UART 512 261632M 0.3% 2d 13h 58m root@sun # ldm add-vconscon port-range=5000-5100 \ primary-console primary root@sun # svcadm enable vntsd root@sun # svcs vntsd STATE STIME FMRI online 9:53:21 svc:/ldoms/vntsd:default root@sun # ldm set-vcpu 16 primary root@sun # ldm set-mau 1 primary root@sun # ldm start-reconf primary root@sun # ldm set-memory 7680m primary root@sun # ldm add-config initial root@sun # shutdown -y -g0 -i6 So what have I done: I've defined a range of ports (5000-5100) for the virtual network terminal service and then started that service.  The vnts will later provide console connections to guest systems, very much like serial NTS's do in the physical world. Next, I assigned 16 vCPUs (on this platform, a T3-4, that's two cores) to the primary domain, freeing the rest up for future guest systems.  I also assigned one MAU to this domain.  A MAU is a crypto unit in the T3 CPU.  These need to be explicitly assigned to domains, just like CPU or memory.  (This is no longer the case with T4 systems, where crypto is always available everywhere.) Before I reassigned the memory, I started what's called a "delayed reconfiguration" session.  
That avoids actually doing the change right away, which would take a considerable amount of time in this case.  Instead, I'll need to reboot once I'm all done.  I've assigned 7680MB of RAM to the primary.  That's 8GB less the 512MB which the hypervisor uses for it's own private purposes.  You can, depending on your needs, work with less.  I'll spend a dedicated article on sizing, discussing the pros and cons in detail. Finally, just before the reboot, I saved my work on the ILOM, to make this configuration available after a powercycle of the box.  (It'll always be available after a simple reboot, but the ILOM needs to know the configuration of the hypervisor after a power-cycle, before the primary domain is booted.) Now, lets create a first disk service and a first virtual switch which is connected to the physical network device igb2. We will later use these to connect virtual disks and virtual network ports of our guest systems to real world storage and network. root@sun # ldm add-vds primary-vds root@sun # ldm add-vswitch net-dev=igb2 switch-primary primary You are free to choose whatever names you like for the virtual disk service and the virtual switch.  I strongly recommend that you choose names that make sense to you and describe the function of each service in the context of your implementation.  For the vswitch, for example, you could choose names like "admin-vswitch" or "production-network" etc. This already concludes the configuration of the control domain.  We've freed up considerable amounts of CPU and RAM for guest systems and created the necessary infrastructure - console, vts and vswitch - so that guests systems can actually interact with the outside world.  The system is now ready to create guests, which I'll describe in the next section. For further reading, here are some recommendable links: The LDoms 2.2 Admin Guide The "Beginners Guide to LDoms" The LDoms Information Center on MOS LDoms on OTN
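As a hedged preview of the guest-creation step mentioned above (the guest name, sizing and disk backend are illustrative assumptions; the vds and vswitch names are the ones created in this article), the basic sequence looks roughly like this:

        root@sun # ldm add-domain ldom1
        root@sun # ldm set-vcpu 16 ldom1
        root@sun # ldm set-memory 8G ldom1
        root@sun # ldm add-vdsdev /dev/dsk/c3t0d1s2 ldom1-boot@primary-vds   # backend LUN is an assumption
        root@sun # ldm add-vdisk vdisk0 ldom1-boot@primary-vds ldom1
        root@sun # ldm add-vnet vnet0 switch-primary ldom1
        root@sun # ldm bind-domain ldom1
        root@sun # ldm start-domain ldom1
        root@sun # telnet localhost 5000    # console port comes from the vconscon range defined earlier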

    Read the article

  • CodePlex Daily Summary for Thursday, June 03, 2010

    CodePlex Daily Summary for Thursday, June 03, 2010New ProjectsAlbatross: Albatross framework. We are still working on the documentation, more details will be available soon.ApiChange: ApiChange is the Swiss army knife for inspecting your assemblies from the command line. Now you can do basic operations like diff, who uses (method...BaseCalendar: BaseCalendar is a server-side ASP.NET web control (WebForms or MVC) that renders a calendar while giving you full control over the generated HTML. ...CESAVE: Proyectos para el Comité Estatal de Sanidad Vegetal.Closure Compiler w/ Annotations Visual Studio 2010 Snippets: This is an attempt to create reusable Visual Studio snippets to make working with closure compiler annotated JavaScript more productive. VS2010 ...Common Service Host: Common Service Host is a generic Windows Communication Service Host and factory that uses the Common Service Locator to create Service objects. ...DarkLight: DarkLight is a 2D Lighting Engine written in XNA, and allows developers to create 2D shadowing effects in their 2D games easily. It supports poi...Earn Burn Tracker: A tool to track earned value against a given release, initiative, feature set, and objects.eOfficeAACS: eOffice is an open source access control and attendance management system developed by e-bird Innovation (www.ebirdinfo.com).Its flexible design al...FLV Video conversion library for .Net 3.5: This is a component created to call the ffmpeg tool to convert various video formats to the Adobe Flash FLV output format. The component also takes...Google Moderator: .NET client library for the Google Moderator API.linq to jquery: provides support for linq to jquery objectsMobile Vikings Data: App to view your data usage RefBrowser: RefBrowserRESX Translator with Bing (from Microsoft Consulting Services, UK): A Windows Form application that automatically translates RESX files using Bing web servicesRhyduino - Remote Arduino Control via Managed Code: Rhyduino makes it easy for Visual Studio / Windows devs to control the Arduino using a computer. It's like supercharging your Arduino with all the ...SharePoint 2010 CSV Bulk Term Set Importer: Allows for multiple import of *.csv files to a given term group in SharePoint 2010 Term Store. It will create new term group based on the name pr...SharePoint Feature - Export history version to Excel: Add a function to list the action button, the ability to export history version of the item sheet to Excel from the specified date. Features suppo...SwEntry: A system that allows people to open doors by using a Bluetooth enabled phone. Things to Do with the DLR: This project is about ideas and sample code around the Dynamic Language Runtime.Work Recorder - Hold on own time!: Work Recorder is a office aid software which can recorde the time used on PC for researchers, office workers and students. And it is also a good he...xuezhixu: xuezhixu foundYaget: Yet Another Game Engine TechnologyNew ReleasesBackUpAnyWhere: backupanywhere RC1: this is the RC of our programBaseCalendar: BaseControls 1.0: BaseControls 1.0 contains the BaseCalendar ASP.NET control.BizTalk Server Pipeline Component Wizard: 2.20: Version suitable for 2010 release.CheckHeader: CheckHeader v0.8.6: The Microsoft .NET Framework 4.0 is needed to run this program.Chirpy - VS Add In For Handling Js, Css, and DotLess Files: Chirpy Installer for VS 2010 (Ver-1.0.0.2): VS 2010 Installer for the Chirpy AddIn. 
Version 1.0.0.2Christoc's DotNetNuke C# Module Development Template: 00.00.01: This is the initial release of Christoc's DotNetNuke C# Module Development Template. You can use the Template as-is, or you can customize the VSTem...Closure Compiler w/ Annotations Visual Studio 2010 Snippets: v1 release: The initial release of the projectCommunity Forums NNTP bridge: Community Forums NNTP Bridge V22: Release of the Community Forums NNTP Bridge to access the social and anwsers MS forums with a single, open source NNTP bridge. This release has ad...Community Forums NNTP bridge: Community Forums NNTP Bridge V23: Release of the Community Forums NNTP Bridge to access the social and anwsers MS forums with a single, open source NNTP bridge. This release has ad...Community Forums NNTP bridge: Community Forums NNTP Bridge V24: Release of the Community Forums NNTP Bridge to access the social and anwsers MS forums with a single, open source NNTP bridge. This release has ad...DarkLight: DarkLight Engine v1.0: This is the first version of the DarkLight engine and currently supports point, spot and area lights with no upper limit on the number of lights. ...DotNetNuke® Skin Collaborate: Collaborate Package 1.1.0: Newer version of Collaborate included fixes: - removed conditional code to display control panel - changed background color to match with backgroun...dotSpatial: System.Spatial.Projection Zip June 2, 2010: This version tries to fix a problem with reprojecting to UTM zones. It is still being tested though.Entity Framework Repository & Unit of Work Template: 1.0.1: This version has more than just the T4 template. I have added a new template that has a RepositoryHelper class for use with StructureMap. Also th...FLV Video conversion library for .Net 3.5: Beta 1: This is the first release of this project. Improvements may be added if necessary.HERB.IQ: Alpha 0.1 Preview: Only clone tab works, just setting up the GUI and getting the XML data handling working correctlyJetfire - Workflow DSL: V1.2.0: The complete source code required for a Jetfire system (server and client nexus) is included in the release. Highlights of Changes Full programmat...linq to jquery: linq to jquery alpha: beta development projectMapWindow6: MapWindow 6.0 June 2, 2010: This version fixes a problem with projecting to UTM zones. I'm not sure that this works perfectly yet. It seemed to require a zone adjustment by ...patterns & practices Web Client Developer Guidance: Developing Web Apps May 2010 Beta: This RelesaeThis drop includes updated documentation, links, and graphics. We are still looking for feedback on this release. Plans going forward...patterns & practices: Composite WPF and Silverlight: Prism 4.0 Drop 1: Prism 4.0 Drop 1 Welcome to the first drop of Prism 4.0 (formally known as the Composite Application Guidance for WPF and Silverlight). This drop i...Powershell4SQL: Version 1.3: Changes from version 1.2 Added support for -Confirm and -WhatIf parameters Added support for -Verbose mode. Includes SQL Batch text, parameters ...RESX Translator with Bing (from Microsoft Consulting Services, UK): v1.0: This is the initial release of the toolRhyduino - Remote Arduino Control via Managed Code: Beta Release (v0.80): LibraryAuto-detects connected Arduino devices. Uses system resources intelligently to take advantage of multiple CPU cores when present. 
Firmata ...SharePoint Feature - Export history version to Excel: Export Item List Version: - multilanguage support Czech, English Install: "C:\Program Files\Common Files\Microsoft Shared\web server extensions\12\BIN\stsadm.exe" -o addsol...Simulo: Simulo v2.5: That's the third release of Simulo (v2.5). For detailed info on what's new, read the changes log block at the project's home page. System requirem...Site Directory for SharePoint 2010 (from Microsoft Consulting Services, UK): v1.5: Please carefully follow the Installation Guideas there are additional actions that need to be undertaken in this release. As 1.4 with the followin...Spackle.NET: 4.1.0.0 Release: Added IEquatable<T> to Range<T>StreamInsight Samples: Microsoft StreamInsight Product Team Samples: This is the current snapshot of the samples created by the Streaminsight Product Team.Touch Mice: 0.1: Initial release of Touch MiceVCC: Latest build, v2.1.30602.0: Automatic drop of latest buildVivoSocial: VivoSocial 7.2.0: Version 7.2.0 of VivoSocial has been released. If you experienced any issues with the previous version, please update your modules to the 7.2.0 rel...Work Recorder - Hold on own time!: WorkRecorder 1.0: +Finished Version 1.0Most Popular ProjectsCommunity Forums NNTP bridgeOutSyncASP.NET MVC Time PlannerNeatUploadMoonyDesk (windows desktop widgets)Mute4eXpress Persistent Objects (XPO) ToolkitAgUnit - Silverlight unit testing with ReSharperASP.NET MVC ExtensionsAviva Solutions C# Coding GuidelinesMost Active ProjectsCommunity Forums NNTP bridgeGMap.NET - Great Maps for Windows Forms & PresentationRawrIonics Isapi Rewrite FilterN2 CMSpatterns & practices – Enterprise LibraryBlogEngine.NETGameSetFarseer Physics EngineMirror Testing System

    Read the article

  • How to Avoid Your Next 12-Month Science Project

    - by constant
    While most customers immediately understand how the magic of Oracle's Hybrid Columnar Compression, intelligent storage servers and flash memory make Exadata uniquely powerful against home-grown database systems, some people think that Exalogic is nothing more than a bunch of x86 servers, a storage appliance and an InfiniBand (IB) network, built into a single rack. After all, isn't this exactly what the High Performance Computing (HPC) world has been doing for decades? On the surface, this may be true. And some people tried exactly that: They tried to put together their own version of Exalogic, but then they discover there's a lot more to building a system than buying hardware and assembling it together. IT is not Ikea. Why is that so? Could it be there's more going on behind the scenes than merely putting together a bunch of servers, a storage array and an InfiniBand network into a rack? Let's explore some of the special sauce that makes Exalogic unique and un-copyable, so you can save yourself from your next 6- to 12-month science project that distracts you from doing real work that adds value to your company. Engineering Systems is Hard Work! The backbone of Exalogic is its InfiniBand network: 4 times better bandwidth than even 10 Gigabit Ethernet, and only about a tenth of its latency. What a potential for increased scalability and throughput across the middleware and database layers! But InfiniBand is a beast that needs to be tamed: It is true that Exalogic uses a standard, open-source Open Fabrics Enterprise Distribution (OFED) InfiniBand driver stack. Unfortunately, this software has been developed by the HPC community with fastest speed in mind (which is good) but, despite the name, not many other enterprise-class requirements are included (which is less good). Here are some of the improvements that Oracle's InfiniBand development team had to add to the OFED stack to make it enterprise-ready, simply because typical HPC users didn't have the need to implement them: More than 100 bug fixes in the pieces that were not related to the Message Passing Interface Protocol (MPI), which is the protocol that HPC users use most of the time, but which is less useful in the enterprise. Performance optimizations and tuning across the whole IB stack: From Switches, Host Channel Adapters (HCAs) and drivers to low-level protocols, middleware and applications. Yes, even the standard HPC IB stack could be improved in terms of performance. Ethernet over IB (EoIB): Exalogic uses InfiniBand internally to reach high performance, but it needs to play nicely with datacenters around it. That's why Oracle added Ethernet over InfiniBand technology to it that allows for creating many virtual 10GBE adapters inside Exalogic's nodes that are aggregated and connected to Exalogic's IB gateway switches. While this is an open standard, it's up to the vendor to implement it. In this case, Oracle integrated the EoIB stack with Oracle's own IB to 10GBE gateway switches, and made it fully virtualized from the beginning. This means that Exalogic customers can completely rewire their server infrastructure inside the rack without having to physically pull or plug a single cable - a must-have for every cloud deployment. Anybody who wants to match this level of integration would need to add an InfiniBand switch development team to their project. Or just buy Oracle's gateway switches, which are conveniently shipped with a whole server infrastructure attached! 
IPv6 support for InfiniBand's Sockets Direct Protocol (SDP), Reliable Datagram Sockets (RDS), TCP/IP over IB (IPoIB) and EoIB protocols. Because no IPv6 = not very enterprise-class. HA capability for SDP. High Availability is not a big requirement for HPC, but for enterprise-class application servers it is. Every node in Exalogic's InfiniBand network is connected twice for redundancy. If any cable or port or HCA fails, there's always a replacement link ready to take over. This requires extra magic at the protocol level to work. So in addition to Weblogic's failover capabilities, Oracle implemented IB automatic path migration at the SDP level to avoid unnecessary failover operations at the middleware level. Security, for example spoof-protection. Another feature that is less important for traditional users of InfiniBand, but very important for enterprise customers. InfiniBand Partitioning and Quality-of-Service (QoS): One of the first questions we get from customers about Exalogic is: “How can we implement multi-tenancy?” The answer is to partition your IB network, which effectively creates many networks that work independently and that are protected at the lowest networking layer possible. In addition to that, QoS allows administrators to prioritize traffic flow in multi-tenancy environments so they can keep their service levels where it matters most. Resilient IB Fabric Management: InfiniBand is a self-managing network, so a lot of the magic lies in coming up with the right topology and in teaching the subnet manager how to properly discover and manage the network. Oracle's Infiniband switches come with pre-integrated, highly available fabric management with seamless integration into Oracle Enterprise Manager Ops Center. In short: Oracle elevated the OFED InfiniBand stack into an enterprise-class networking infrastructure. Many years and multiple teams of manpower went into the above improvements - this is something you can only get from Oracle, because no other InfiniBand vendor can give you these features across the whole stack! Exabus: Because it's not About the Size of Your Network, it's How You Use it! So let's assume that you somehow were able to get your hands on an enterprise-class IB driver stack. Or maybe you don't care and are just happy with the standard OFED one? Anyway, the next step is to actually leverage that InfiniBand performance. Here are the choices: Use traditional TCP/IP on top of the InfiniBand stack, Develop your own integration between your middleware and the lower-level (but faster) InfiniBand protocols. While more bandwidth is always a good thing, it's actually the low latency that enables superior performance for your applications when running on any networking infrastructure: The lower the latency, the faster the response travels through the network and the more transactions you can close per second. The reason why InfiniBand is such a low latency technology is that it gets rid of most if not all of your traditional networking protocol stack: Data is literally beamed from one region of RAM in one server into another region of RAM in another server with no kernel/drivers/UDP/TCP or other networking stack overhead involved! Which makes option 1 a no-go: Adding TCP/IP on top of InfiniBand is like adding training wheels to your racing bike. It may be ok in the beginning and for development, but it's not quite the performance IB was meant to deliver. Which only leaves option 2: Integrating your middleware with fast, low-level InfiniBand protocols. 
And this is what Exalogic's "Exabus" technology is all about. Here are a few Exabus features that help applications leverage the performance of InfiniBand in Exalogic: RDMA and SDP integration at the JDBC driver level (SDP), for Oracle Weblogic (SDP), Oracle Coherence (RDMA), Oracle Tuxedo (RDMA) and the new Oracle Traffic Director (RDMA) on Exalogic. Using these protocols, middleware can communicate a lot faster with each other and the Oracle database than by using standard networking protocols, Seamless Integration of Ethernet over InfiniBand from Exalogic's Gateway switches into the OS, Oracle Weblogic optimizations for handling massive amounts of parallel transactions. Because if you have an 8-lane Autobahn, you also need to improve your ramps so you can feed it with many cars in parallel. Integration of Weblogic with Oracle Exadata for faster performance, optimized session management and failover. As you see, “Exabus” is Oracle's word for describing all the InfiniBand enhancements Oracle put into Exalogic: OFED stack enhancements, protocols for faster IB access, and InfiniBand support and optimizations at the virtualization and middleware level. All working together to deliver the full potential of InfiniBand performance. Who else has 100% control over their middleware so they can develop their own low-level protocol integration with InfiniBand? Even if you take an open source approach, you're looking at years of development work to create, test and support a whole new networking technology in your middleware! The Extras: Less Hassle, More Productivity, Faster Time to Market And then there are the other advantages of Engineered Systems that are true for Exalogic the same as they are for every other Engineered System: One simple purchasing process: No headaches due to endless RFPs and no “Will X work with Y?” uncertainties. Everything has been engineered together: All kinds of bugs and problems have been already fixed at the design level that would have only manifested themselves after you have built the system from scratch. Everything is built, tested and integrated at the factory level . Less integration pain for you, faster time to market. Every Exalogic machine world-wide is identical to Oracle's own machines in the lab: Instant replication of any problems you may encounter, faster time to resolution. Simplified patching, management and operations. One throat to choke: Imagine finger-pointing hell for systems that have been put together using several different vendors. Oracle's Engineered Systems have a single phone number that customers can call to get their problems solved. For more business-centric values, read The Business Value of Engineered Systems. Conclusion: Buy Exalogic, or get ready for a 6-12 Month Science Project And here's the reason why it's not easy to "build your own Exalogic": There's a lot of work required to make such a system fly. In fact, anybody who is starting to "just put together a bunch of servers and an InfiniBand network" is really looking at a 6-12 month science project. And the outcome is likely to not be very enterprise-class. And it won't have Exalogic's performance either. Because building an Engineered System is literally rocket science: It takes a lot of time, effort, resources and many iterations of design/test/analyze/fix to build such a system. That's why InfiniBand has been reserved for HPC scientists for such a long time. 
And only Oracle can bring the power of InfiniBand in an enterprise-class, ready-to-use, pre-integrated version to customers, without the develop/integrate/support pain. For more details, check the new Exalogic overview white paper, which was updated only recently. P.S.: Thanks to my colleagues Ola, Paul, Don and Andy for helping me put together this article!
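As a hedged illustration of the SDP integration mentioned above (host, port and service name are made-up placeholders; check the Exalogic and WebLogic documentation for your release for the exact steps), enabling SDP for a WebLogic JDBC connection typically involves a JVM switch plus an SDP address in the connect descriptor:

        # JVM options for the WebLogic managed servers on the Exalogic side
        export JAVA_OPTIONS="$JAVA_OPTIONS -Doracle.net.SDP=true -Djava.net.preferIPv4Stack=true"

        # JDBC URL pointing at an SDP listener endpoint on the database side
        # jdbc:oracle:thin:@(DESCRIPTION=
        #   (ADDRESS=(PROTOCOL=sdp)(HOST=exa-db-ib-vip)(PORT=1522))
        #   (CONNECT_DATA=(SERVICE_NAME=myservice)))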

    Read the article

  • MSCC: Purpose and benefits of Version Control Systems (VCS)

    Unfortunately, there was no monthly meetup during May. Which means that it was even more important and interesting to go forward with a great topic for this month. Earlier this year I already spoke to Nayar Joolfoo about doing a presentation on version control systems (VCS), and he gladly agreed since then. It was just about finding the right date for the action. Furthermore, it was also a great coincidence that Avinash Meetoo announced on social media networks that Knowledge 7 is about to have a new training on "Effective git" - which correlates to a book title Avinash is currently working on - all the best with your approach on this and reach out to our MSCC craftsmen for recessions. Once again a big Thank you to Orange Ebene Accelerator on providing the venue for us, and the MSCC members involved on securing the time slot for our event. Unfortunately, it's kind of tough to get an early confirmation for our meetups these days. I'll keep you posted on that one as there are some interesting and exciting options coming up soon. Okay, let's talk about the meeting and version control systems again. As usual, I'm going to put my first impression of the meetup: "Absolutely great topic, questions and discussions on version control systems, like git or VSO. I was also highly pleased by the number of first timers and female IT geeks. Hopefully, we will be able to keep this trend for future get-togethers." And I really have to emphasise the amount of fresh blood coming to our gathering. Also, during the initial phase it was surprising to see that exactly those first-timers, most of them students at various campuses here on the island, had absolutely no idea about version control systems. More about further down... Reactions of other attendees If I counted correctly, we had a total of 17 attendees this month, and I'd like to give you feedback from some of them: "Inspiring. Helped me understand more about GIT." -- Sean on event comments "Joined the meetup today with literally no idea what is a version control system. I have several reasons why I should be starting to use VCS as from NOW in my projects. Thanks Nayar, Jochen and other participants :)" -- Yudish on event comments "Was present today and I'm very satisfied.I was not aware if there was a such tool like git available. Thanks to those who contributed for this meetup.It was great. Learned a lot from this meetup!!" -- Leonardo on event comments "Seriously, I can see how it’s going to ease my task and help me save time. Gone are the issues with files backups.  And since I’ll be doing my dissertation this year, using Git would help me a lot for my backups and I’m grateful to Nayar for the great explanation." -- Swan-Iyah on MSCC meetup : Version Controls Hopefully, I'll be able to get some other sources - personal blogs preferred - on our meeting. Geeks, thank you so much for those encouraging comments. It's really great to experience that we, all members of the MSCC, are doing the right thing to get more IT information out, and to help each other to improve and evolve in our professional careers. Our agenda of the day Honestly, we had a bumpy start... First, I was battling a little bit with the movable room divider in order to maximize the space. I mean, we had 24 RSVPs and usually there might additional people coming along. Then, for what ever reason, we were facing power outages - actually twice in short periods. Not too good for the projector after all, but hey it went smooth for the rest of the time being. And last but not least... 
our first speaker Nayar got stuck somewhere on the road. ;-) Anyway, not a real show-stopper and we used the time until Nayar's arrival to introduce ourselves a little bit. It is always important for me to get to know the "newbies" a little bit, and as a result we had lots of students of university - first year, second year and recent graduates - among them. Surprisingly, none of them was ever in contact with version control systems at all. I mean, this is a shocking discovery! Similar to the ability of touch-typing I'd say that being able to use (and master) any kind of version control system is compulsory in any job in the IT industry. Seriously, I'm wondering what is being taught during the classes on the campus. All of them have to work on semester assessments or final projects, even in small teams of 2-4 people. That's the perfect occasion to get started with VCS. Already in this phase, we had great input from more experienced VCS users, like Sean, Avinash and myself. git - a modern approach to VCS - Nayar What a tour! Nayar gave us the full round of git from start to finish, even touching some more advanced techniques. First, he started to explain about the importance of version control systems as an essential tool for software developers, even working alone on a project, and the ability to have a kind of "time machine" that allows you to inspect and revert to a previous version of source code at any time. Then he showed how easy it is to install git on an Ubuntu based system but also mentioned that git is literally available for any operating system, like Windows, Mac OS X and of course other Linux distributions. Next, he showed us how to set the initial configuration values of user name and email address which simplifies the daily usage of the git client while working with your repositories. Then he initialised and added a new repository for some local development of a blogging software. All commands were done using the command line interface (CLI) so that they can be repeated on any system as reference. The syntax and the procedure is always the same, and Nayar clearly mentioned this to the attendees. Now, having a git repository in place it was about time to work on some "important" changes on the blogging software - just for the sake of demonstrating the ease of use and power of git. One interesting question came very early: "How many commands do we have to learn? It looks quite difficult at the moment" - Well, rest assured that during daily development circles you will need less than 10 git commands on a regular base: git add, commit, push, pull, checkout, and merge And Nayar demo'd all of them. Much to the delight of everyone he also showed gitk which is the git repository browser. It's an UI tool to display changes in a repository or a selected set of commits. This includes visualizing the commit graph, showing information related to each commit, and the files in the trees of each revision. Using gitk to display and browse information of a local git repository And last but not least, we took advantage of the internet connectivity and reached out to various online portals offering git hosting for free. Nayar showed us how to push the local repository into a remote system on github. Showing the web-based git browser and history handling, and then also explained and demo'd on how to connect to existing online repositories in order to get access to either your own source code or other people's open source projects. 
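For readers who could not attend, here is a minimal sketch of the kind of command sequence Nayar walked through (the repository name, user details and remote URL are illustrative):

        $ git config --global user.name  "Your Name"
        $ git config --global user.email "you@example.com"
        $ git init blog-demo && cd blog-demo         # create a new local repository
        $ echo "my blogging software" > README
        $ git add README
        $ git commit -m "Initial commit"
        $ gitk --all                                 # browse the history and commits graphically
        $ git remote add origin https://github.com/youruser/blog-demo.git
        $ git push -u origin master                  # publish the repository on github
        $ git pull                                   # later: fetch and merge changes made elsewhere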
Next to github, we also spoke about bitbucket and gitlab as potential online platforms for your projects. Have a look at the conditions and details of their free service packages and what you can get additionally as a paying customer. Usually, you already get a lot of services for up to five users for free, but there might be other important aspects that have an impact on your decision. Anyway, moving git-based repositories between systems is a piece of cake, and changing online platforms is possible at any stage of your development. Visual Studio Online (VSO) - Jochen Well, Nayar literally covered all elements of working with git during his session, including the use of external online platforms. So, what would be the advantage of talking about Visual Studio Online (VSO)? First of all, VSO is "just another" online platform for hosting and managing git repositories on remote systems, equivalent to github, bitbucket, or any other web site. At the moment (of writing), Microsoft also provides a free package of up to five users / developers on a git repository, but there is more in that package. Of course, it is related to software development on Windows systems and the bonds are tightened towards the use of Visual Studio, but from experience you are absolutely not restricted to that. Connecting a Linux or Mac OS X machine with a git client or an integrated development environment (IDE) like Eclipse or Xcode works as smoothly as expected. So, why should one opt for VSO? Well, one of the main aspects that I would like to mention here is that VSO integrates Microsoft's Application Lifecycle Management (ALM) into the platform, meaning that you get agile project management with Backlogs, Sprints and Burn-down charts, as well as the ability to track tasks, bug reports and work items next to collaborative team chats. It's the whole package of agile development you'll get. And, something I mentioned briefly at the beginning of our meeting, VSO gives you the possibility of an automated continuous integration (CI) process which builds and can run tests of your source code after each commit of changes. Having a proper CI strategy is also part of the Clean Code Developer practices - on Level Green, actually - and not only simplifies your life as a software developer but also reduces the sources of potential errors. Seamless integration and automated deployment between Microsoft Azure Web Sites and a git repository. But my favourite feature is the seamless continuous deployment to Microsoft Azure. Especially while working on web projects, it's absolutely astonishing that as soon as you commit your changes it just takes a couple of seconds until your modifications are deployed and available on your Azure-hosted web sites. Upcoming Events and networking Due to the adjusted times, everybody was kind of hungry and we didn't follow up on networking or upcoming events - very unfortunate in my opinion, and this will have an impact on the future planning of our meetups. I would rather see more conversations during and at the end of our meetings than everyone just packing their laptops, bags and accessories and rushing off to grab some food. I was hoping to get some information regarding this year's Code Challenge - supposedly to be organised during July? Maybe someone could leave a comment on that - but I couldn't get any updates. Well, I'll keep digging... 
In case you would like to get more into git and how to use it effectively, please check out Knowledge 7's upcoming course on "Effective git". Thanks, Avinash, for your vital input into today's conversation, and I'm looking forward to getting a grip on your book very soon. My resume of the day Do not work in IT without any kind of version control system! Seriously, without a VCS in place you're doing it wrong. It's like driving a car without the seat belt fastened or riding your bike without a safety helmet. You don't do that! End of discussion. ;-) Nowadays, with access to free (as in cost) tools to install on your machine and numerous online platforms that host your source code for free for up to five users, it's a no-brainer to get yourself familiar with VCS. Today's sessions gave a good overview of how to start using git and how to connect to various remote services like github or VSO.

    Read the article

  • How I Work: A Cloud Developer's Workstation

    - by BuckWoody
    I've written here a little about how I work during the day, including things like using a stand-up desk (still doing that, by the way). Inspired by a Twitter conversation yesterday, I thought I might explain how I set up my computing environment. First, a couple of important points. I work in Cloud Computing, specifically (but not limited to) Windows Azure. Windows Azure has features to run a Virtual Machine (IaaS), run code without having to control a Virtual Machine (PaaS) and use databases, video streaming, Hadoop and more (a kind of SaaS for tech pros). As such, my designs run the gamut of on-premises, VM's in the Cloud, and software that I write for a platform. I focus on data primarily, meaning that I design a lot of systems that use an RDBMS (like SQL Server or Windows Azure Databases) or a NoSQL approach (MongoDB on Azure or large-scale Key-Value Pairs in Table storage) and even Hadoop and R, and also Cloud Numerics in F#. All that being said, those things inform my choices below. Hardware I have a Lenovo X220 tablet/laptop which I really like a great deal - it's a light, tough, extremely fast system. When I travel, that's the system I take. It has 8GB of RAM, and an SSD drive. I sometimes use that to develop or work at a client's site, on the road, or in the living room when I'm not in my home office. My main system is a GateWay DX430017 - I've maxed it out on RAM, and I have two 1TB drives in it. It's not only my workstation for work; I leave it on all the time and it streams our videos, music and books. I have about 3400 e-books, and I've just started using Calibre to stream the library. I run Windows 8 on it so I can set up Hyper-V images, since Windows Azure allows me to move regular Hyper-V disks back and forth to the Cloud. That's where all my "servers" are, when I have to use an IaaS approach. The reason I use a desktop-style system rather than a laptop only approach is that a good part of my job is setting up architectures to solve really big, complex problems. That means I have to simulate entire networks on-premises, along with the Hybrid Cloud approach I use a lot. I need a lot of disk space and memory for that, and I use two huge monitors on my stand-up desk. I could probably use 10 monitors if I had the room for them. Also, since it's our home system as well, I leave it on all the time and it doesn't travel.   Software For the software for my systems, it's important to keep in mind that I not only write code, but I design databases, teach, present, and create Linux and other environments. Windows 8 - While the jury is out for me on the new interface, the context-sensitive search, integrated everything, and speed is just hands-down the right choice. I've evaluated a server OS, Linux, even an Apple, but I just am not as efficient on those as I am with Windows 8. Visual Studio Ultimate - I develop primarily in .NET (C# and F# mostly) and I use the Team Foundation Server in the cloud, and I'm asked to do everything from UI to Services, so I need everything. Windows Azure SDK, Windows Azure Training Kit - I need the first to set up my Azure PaaS coding, and the second has all the info I need for PaaS, IaaS and SaaS. This is primarily how I get paid. :) SQL Server Developer Edition - While I might install Oracle, MySQL and Postgres on my VM's, the "outside" environment is SQL Server for an RDBMS. I install the Developer Edition because it has the same features as Enterprise Edition, and comes with all the client tools and documentation. 
Microsoft Office -  Even if I didn't work here, this is what I would use. I've just grown too accustomed to doing business this way to change, so my advice is always "use what works", and this does. The parts I use are: OneNote (and a Math Add-in) - I do almost everything - and I mean everything in OneNote. I can code, do high-end math, present, design, collaborate and more. All my notebooks are on my Skydrive. I can use them from any system, anywhere. If you take the time to learn this program, you'll be hooked. Excel with PowerPivot - Don't make that face. Excel is the world's database, and every Data Scientist I know - even the ones where I teach at the University of Washington - know it, use it, and love it.  Outlook - Primary communications, CRM and contact tool. I have all of my social media hooked up to it, so when I get an e-mail from you, I see everything, see all the history we've had on e-mail, find you on a map and more. Lync - I was fine with LiveMeeting, although it has it's moments. For me, the Lync client is tres-awesome. I use this throughout my day, present on it, stay in contact with colleagues and the folks on the dev team (who wish I didn't have it) and more.  PowerPoint - Once again, don't make that face. Whenever I see someone complaining about PowerPoint, I have 100% of the time found they don't know how to use it. If you suck at presenting or creating content, don't blame PowerPoint. Works great on my machine. :) Zoomit - Magnifier - On Windows 7 (and 8 as well) there's a built-in magnifier, but I install Zoomit out of habit. It enlarges the screen. If you don't use one of these tools (or their equivalent on some other OS) then you're presenting/teaching wrong, and you should stop presenting/teaching until you get them and learn how to show people what you can see on your tiny, tiny monitor. :) Cygwin - Unix for Windows. OK, that's not true, but it's mostly that. I grew up on mainframes and Unix (IBM and HP, thank you) and I can't imagine life without  sed, awk, grep, vim, and bash. I also tend to take a lot of the "Science" and "Development" and "Database" packages in it as well. PuTTY - Speaking of Unix, when I need to connect to my Linux VM's in Windows Azure, I want to do it securely. This is the tool for that. Notepad++ - Somewhere between torturing myself in vim and luxuriating in OneNote is Notepad++. Everyone has a favorite text editor; this one is mine. Too many features to name, and it's free. Browsers - I install Chrome, Firefox and of course IE. I know it's in vogue to rant on IE, but I tend to think for myself a great deal, and I've had few (none) problems with it. The others I have for the haterz that make sites that won't run in IE. Visio - I've used a lot of design packages, but none have the extreme meta-data edit capabilities of Visio. I don't use this all the time - it can be rather heavy, but what it does it does really well. I also present this way when I'm not using PowerPoint. Yup, I just bring up Visio and diagram away as I'm chatting with clients. Depending on what we're covering, this can be the right tool for that. Tweetdeck - The AIR one, not that new disaster they came out with. I live on social media, since you, dear readers, are my cube-mates. When I get tired of you all, I close Tweetdeck. When I need help or someone needs help from me, or if I want to see a picture of a cat while I'm coding, I bring it up. It's up most all day and night. 
Windows Media Player - I listen to Trance or Classical when I code, and I find music managers overbearing and extra. I just use what comes in the box, and it works great for me. R - F# and Cloud Numerics now allows me to load in R libraries (yay!) and I use this for statistical work on big data loads. Microsoft Math - One of the most amazing, free, rich, amazing, awesome, amazing calculators out there. I get the 64-bit version for quick math conversions, plots and formula-checks. Python - I know, right? Who knew that the scientific community loved Python so much. But they do. I use 2.7; not as much runs with 3+. I also use IronPython in Visual Studio, or I edit in Notepad++ Camstudio recorder - Windows PSR - In much of my training, and all of my teaching at the UW, I need to show a process on a screen. Camstudio records screen and voice, and it's free. If I need to make static training, I use the Windows PSR tool that's built right in. It's ostensibly for problem duplication, but I use it to record for training.   OK - your turn. Post a link to your blog entry below, and tell me how you set your system up.  

    Read the article

  • Making a Case For The Command Line

    - by Jesse Taber
Originally posted on: http://geekswithblogs.net/GruffCode/archive/2013/06/30/making-a-case-for-the-command-line.aspx I have had an idea percolating in the back of my mind for over a year now that I've just recently started to implement. This idea relates to building out "internal tools" to ease the maintenance and on-going support of a software system. The system that I currently work on is (mostly) web-based, so traditionally we have built these internal tools in the form of pages within the app that are only accessible by our developers and support personnel. These pages allow us to perform tasks within the system that, for one reason or another, we don't want to let our end users perform (e.g. mass create/update/delete operations on data, flipping switches that turn paid modules of the system on or off, etc.). When we try to build new tools like this we often struggle with the level of effort required to build them. Effort Required Creating a whole new page in an existing web application can be a fairly large undertaking. You need to create the page and ensure it will have a layout that is consistent with the other pages in the app. You need to decide what types of input controls need to go onto the page. You need to ensure that everything uses the same style as the rest of the site. You need to figure out what the text on the page should say. Then, when you figure out that you forgot about an input that should really be present, you might have to go back and re-work the entire thing. Oh, and in addition to all of that, you still have to, you know, write the code that actually performs the task. Everything other than the code that performs the task at hand is just overhead. We don't need a fancy date picker control in a nicely styled page for the vast majority of our internal tools. We don't even really need a page, for that matter. We just need a way to issue a command to the application and have it, in turn, execute the code that we've written to accomplish a given task. All we really need is a simple console application! Plumbing Problems A former co-worker of mine, John Sonmez, always advocated the Unix philosophy for building internal tools: start with something that runs at the command line, and then build a UI on top of that if you need to. John's idea has a lot of merit, and we tried building out some internal tools as simple Console applications. Unfortunately, this was often easier said than done. Doing a "File -> New Project" to build out a tool for a mature system can be pretty daunting because that new project is totally empty. In our case, the web application code had a lot of "plumbing" built in: it managed authentication and authorization, it handled database connection management for our multi-tenanted architecture, and it managed all of the context that needs to follow a user around the application, such as their timezone and regional/language settings. In addition, the configuration file for the web application (a web.config in our case because this is an ASP.NET application) is large and would need to be reproduced into a similar configuration file for a Console application. While most of these problems could be solved pretty easily with some refactoring of the codebase, building Console applications for internal tools still potentially suffers from one pretty big drawback: you'd have to execute them on a machine with network access to all of the needed resources. 
Obviously, our web servers can easily communicate the the database servers and can publish messages to our service bus, but the same is not true for all of our developer and support personnel workstations. We could have everyone run these tools remotely via RDP or SSH, but that’s a bit cumbersome and certainly a lot less convenient than having the tools built into the web application that is so easily accessible. Mix and Match So we need a way to build tools that are easily accessible via the web application but also don’t require the overhead of creating a user interface. This is where my idea comes into play: why not just build a command line interface into the web application? If it’s part of the web application we get all of the plumbing that comes along with that code, and we’re executing everything on the web servers which means we’ll have access to any external resources that we might need. Rather than having to incur the overhead of creating a brand new page for each tool that we want to build, we can create one new page that simply accepts a command in text form and executes it as a request on the web server. In this way, we can focus on writing the code to accomplish the task. If the tool ends up being heavily used, then (and only then) should we consider spending the time to build a better user experience around it. To be clear, I’m not trying to downplay the importance of building great user experiences into your system; we should all strive to provide the best UX possible to our end users. I’m only advocating this sort of bare-bones interface for internal consumption by the technical staff that builds and supports the software. This command line interface should be the “back end” to a highly polished and eye-pleasing public face. Implementation As I mentioned at the beginning of this post, this is an idea that I’ve had for awhile but have only recently started building out. I’ve outlined some general guidelines and design goals for this effort as follows: Text in, text out: In the interest of keeping things as simple as possible, I want this interface to be purely text-based. Users will submit commands as plain text, and the application will provide responses in plain text. Obviously this text will be “wrapped” within the context of HTTP requests and responses, but I don’t want to have to think about HTML or CSS when taking input from the user or displaying responses back to the user. Task-oriented code only: After building the initial “harness” for this interface, the only code that should need to be written to create a new internal tool should be code that is expressly needed to accomplish the task that the tool is intended to support. If we want to encourage and enable ourselves to build good tooling, we need to lower the barriers to entry as much as possible. Built-in documentation: One of the great things about most command line utilities is the ‘help’ switch that provides usage guidelines and details about the arguments that the utility accepts. Our web-based command line utility should allow us to build the documentation for these tools directly into the code of the tools themselves. I finally started trying to implement this idea when I heard about a fantastic open-source library called CLAP (Command Line Auto Parser) that lets me meet the guidelines outlined above. CLAP lets you define classes with public methods that can be easily invoked from the command line. 
Here's a quick example of the code that would be needed to create a new tool to do something within your system:

public class CustomerTools
{
    [Verb]
    public void UpdateName(int customerId, string firstName, string lastName)
    {
        // invoke internal services/domain objects/whatever to perform the update
    }
}

This is just a regular class with a single public method (though you could have as many methods as you want). The method is decorated with the 'Verb' attribute that tells the CLAP library that it is a method that can be invoked from the command line. Here is how you would invoke that code: Parser.Run(args, new CustomerTools()); Note that 'args' is just a string[] that would normally be passed in from the static Main method of a Console application. Also, CLAP allows you to pass in multiple classes that define [Verb] methods, so you can opt to organize the code that CLAP will invoke in any way that you like. You can invoke this code from a command line application like this: SomeExe UpdateName -customerId:123 -firstName:Jesse -lastName:Taber 'SomeExe' in this example just represents the name of the .exe that would be created from our Console application. CLAP then interprets the arguments passed in to find the method that should be invoked and automatically parses out the parameters that need to be passed in. After a quick spike, I've found that invoking the 'Parser' class can be done from within the context of a web application just as easily as it can from within the 'Main' method entry point of a Console application. There are, however, a few sticking points that I'm working around: Splitting arguments into the 'args' array like the command line: When you invoke a standard .NET console application you get the arguments that were passed in by the user split into a handy array (this is the 'args' parameter referenced above). Generally speaking they get split by whitespace, but it's also clever enough to handle things like ignoring whitespace in a phrase that is surrounded by quotes. We'll need to re-create this logic within our web application so that we can give the 'args' value to CLAP just like a console application would. Providing a response to the user: If you were writing a console application, you might just use Console.WriteLine to provide responses to the user as to the progress and eventual outcome of the command. We can't use Console.WriteLine within a web application, so I'll need to find another way to provide feedback to the user. Preferably this approach would allow me to use the same handler classes from both a Console application and a web application, so some kind of strategy pattern will likely emerge from this effort. Submitting files: Often an internal tool needs to support doing some kind of operation in bulk, and the easiest way to submit the data needed to support the bulk operation is in a file. Getting the file uploaded and available to the CLAP handler classes will take a little bit of effort. Mimicking the console experience: This isn't really a requirement so much as a "nice to have". To start out, the command-line interface in the web application will probably be a single 'textarea' control with a button to submit the contents to a handler that will pass it along to CLAP to be parsed and run. I think it would be interesting to use some javascript and CSS trickery to change that page into something with more of a "shell" interface look and feel. 
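To make the first two sticking points a bit more concrete, here is a minimal C# sketch of how they could be tackled. The CommandLineSplitter helper and the TextWriter-based output strategy are assumptions of mine for illustration, not code from the post or from the CLAP library itself; only the [Verb] attribute and the Parser.Run call mirror the CLAP usage shown above.

using System;
using System.Collections.Generic;
using System.IO;
using System.Text;
using CLAP; // the Command Line Auto Parser package referenced in this post

// Hypothetical helper: splits a raw command string into an args array the way the
// console would - split on whitespace, but keep phrases in double quotes together.
public static class CommandLineSplitter
{
    public static string[] SplitArguments(string commandLine)
    {
        var args = new List<string>();
        var current = new StringBuilder();
        bool inQuotes = false;

        foreach (char c in commandLine ?? string.Empty)
        {
            if (c == '"')
            {
                inQuotes = !inQuotes;              // toggle quoted phrase, drop the quote itself
            }
            else if (char.IsWhiteSpace(c) && !inQuotes)
            {
                if (current.Length > 0)            // end of the current argument
                {
                    args.Add(current.ToString());
                    current.Clear();
                }
            }
            else
            {
                current.Append(c);
            }
        }

        if (current.Length > 0)
        {
            args.Add(current.ToString());
        }
        return args.ToArray();
    }
}

// Hypothetical output strategy: the handler writes to a TextWriter, so the same class
// works with Console.Out in a console app and with a StringWriter in the web app.
public class CustomerTools
{
    private readonly TextWriter _output;

    public CustomerTools(TextWriter output)
    {
        _output = output;
    }

    [Verb]
    public void UpdateName(int customerId, string firstName, string lastName)
    {
        // invoke internal services/domain objects/whatever to perform the update...
        _output.WriteLine("Customer {0} renamed to {1} {2}", customerId, firstName, lastName);
    }
}

// Sketch of the web handler wiring it together:
//   string[] args = CommandLineSplitter.SplitArguments(commandText);
//   var writer = new StringWriter();
//   Parser.Run(args, new CustomerTools(writer));
//   string responseText = writer.ToString();   // returned to the browser as plain text

A console host would simply pass Console.Out instead of a StringWriter, which is the kind of strategy pattern hinted at above; the sketch also ignores escaped quotes and other edge cases a real splitter would need to handle.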
I’ll be blogging more about this effort in the future and will include some code snippets (or maybe even a full blown example app) as I progress. I also think that I’ll probably end up either submitting some pull requests to the CLAP project or possibly forking/wrapping it into a more web-friendly package and open sourcing that.

    Read the article

  • 8 Reasons Why Even Microsoft Agrees the Windows Desktop is a Nightmare

    - by Chris Hoffman
Let's be honest: The Windows desktop is a mess. Sure, it's extremely powerful and has a huge software library, but it's not a good experience for average people. It's not even a good experience for geeks, although we tolerate it. Even Microsoft agrees about this. Microsoft's Surface tablets with Windows RT don't support any third-party desktop apps. They consider this a feature — users can't install malware and other desktop junk, so the system will always be speedy and secure. Malware is Still Common Malware may not affect geeks, but it certainly continues to affect average people. Securing Windows, keeping it secure, and avoiding unsafe programs is a complex process. There are over 50 different file extensions to keep track of that can contain harmful code. It's easy to have theoretical discussions about how malware could infect Mac computers, Android devices, and other systems. But Mac malware is extremely rare, and has generally been caused by problems with the terrible Java plug-in. Macs are configured to only run executables from identified developers by default, whereas Windows will run everything. Android malware is talked about a lot, but Android malware is rare in the real world and is generally confined to users who disable security protections and install pirated apps. Google has also taken action, rolling out built-in antivirus-like app checking to all Android devices, even old ones running Android 2.3, via Play Services. Whatever the reason, Windows malware is still common while malware for other systems isn't. We all know it — anyone who does tech support for average users has dealt with infected Windows computers. Even users who can avoid malware are stuck dealing with complex and nagging antivirus programs, especially since it's now so difficult to trust Microsoft's antivirus products. Manufacturer-Installed Bloatware is Terrible Sit down with a new Mac, Chromebook, iPad, Android tablet, Linux laptop, or even a Surface running Windows RT and you can enjoy using your new device. The system is a clean slate for you to start exploring and installing your new software. Sit down with a new Windows PC and the system is a mess. Rather than be delighted, you're stuck reinstalling Windows and then installing the necessary drivers, or you're forced to start uninstalling useless bloatware programs one-by-one, trying to figure out which ones are actually useful. After uninstalling the useless programs, you may end up with a system tray full of icons for ten different hardware utilities anyway. The first experience of using a new Windows PC is frustration, not delight. Yes, bloatware is still a problem on Windows 8 PCs. Manufacturers can customize the Refresh image, preventing bloatware from easily being removed. Finding a Desktop Program is Dangerous Want to install a Windows desktop program? Well, you'll have to head to your web browser and start searching. It's up to you, the user, to know which programs are safe and which are dangerous. Even if you find a website for a reputable program, the advertisements on that page will often try to trick you into downloading fake installers full of adware. While it's great to have the ability to leave the app store and get software that the platform's owner hasn't approved — as on Android — this is no excuse for not providing a good, secure software installation experience for typical users installing typical programs. 
Even Reputable Desktop Programs Try to Install Junk Even if you do find an entirely reputable program, you'll have to keep your eyes open while installing it. It will likely try to install adware, add browser toolbars, change your default search engine, or change your web browser's home page. Even Microsoft's own programs do this — when you install Skype for Windows desktop, it will attempt to modify your browser settings to use Bing, even if you've specifically chosen another search engine and home page. With Microsoft setting such an example, it's no surprise so many other software developers have followed suit. Geeks know how to avoid this stuff, but there's a reason program installers continue to do this. It works and tricks many users, who end up with junk installed and settings changed. The Update Process is Confusing On iOS, Android, and Windows RT, software updates come from a single place — the app store. On Linux, software updates come from the package manager. On Mac OS X, typical users' software updates likely come from the Mac App Store. On the Windows desktop, software updates come from… well, every program has to create its own update mechanism. Users have to keep track of all these updaters and make sure their software is up-to-date. Most programs now have their act together and automatically update by default, but users who have old versions of Flash and Adobe Reader installed are vulnerable until they realize their software isn't automatically updating. Even if every program updates properly, the sheer mess of updaters is clunky, slow, and confusing in comparison to a centralized update process. Browser Plugins Open Security Holes It's no surprise that other modern platforms like iOS, Android, Chrome OS, Windows RT, and Windows Phone don't allow traditional browser plugins, or only allow Flash and build it into the system. Browser plugins provide a wealth of different ways for malicious web pages to exploit the browser and open the system to attack. Browser plugins are one of the most popular attack vectors because of how many users have out-of-date plugins and how many plugins, especially Java, seem to be designed without taking security seriously. Oracle's Java plugin even tries to install the terrible Ask toolbar when installing security updates. That's right — the security update process is also used to cram additional adware into users' machines so unscrupulous companies like Oracle can make a quick buck. It's no wonder that most Windows PCs have an out-of-date, vulnerable version of Java installed. Battery Life is Terrible Windows PCs have bad battery life compared to Macs, iOS devices, and Android tablets, all of which Windows now competes with. Even Microsoft's own Surface Pro 2 has bad battery life. Apple's 11-inch MacBook Air, which has very similar hardware to the Surface Pro 2, offers double its battery life when web browsing. Microsoft has been fond of blaming third-party hardware manufacturers for their poorly optimized drivers in the past, but there's no longer any room to hide. The problem is clearly Windows. Why is this? No one really knows for sure. Perhaps Microsoft has kept on piling Windows component on top of Windows component and many older Windows components were never properly optimized. Windows Users Become Stuck on Old Windows Versions Apple's new OS X 10.9 Mavericks upgrade is completely free to all Mac users and supports Macs going back to 2007. Apple has also announced their intention that all new releases of Mac OS X will be free. 
In 2007, Microsoft had just shipped Windows Vista. Macs from the Windows Vista era are being upgraded to the latest version of the Mac operating system for free, while Windows PCs from the same era are probably still using Windows Vista. There’s no easy upgrade path for these people. They’re stuck using Windows Vista and maybe even the outdated Internet Explorer 9 if they haven’t installed a third-party web browser. Microsoft’s upgrade path is for these people to pay $120 for a full copy of Windows 8.1 and go through a complicated process that’s actaully a clean install. Even users of Windows 8 devices will probably have to pay money to upgrade to Windows 9, while updates for other operating systems are completely free. If you’re a PC geek, a PC gamer, or someone who just requires specialized software that only runs on Windows, you probably use the Windows desktop and don’t want to switch. That’s fine, but it doesn’t mean the Windows desktop is actually a good experience. Much of the burden falls on average users, who have to struggle with malware, bloatware, adware bundled in installers, complex software installation processes, and out-of-date software. In return, all they get is the ability to use a web browser and some basic Office apps that they could use on almost any other platform without all the hassle. Microsoft would agree with this, touting Windows RT and their new “Windows 8-style” app platform as the solution. Why else would Microsoft, a “devices and services” company, position the Surface — a device without traditional Windows desktop programs — as their mass-market device recommended for average people? This isn’t necessarily an endorsement of Windows RT. If you’re tech support for your family members and it comes time for them to upgrade, you may want to get them off the Windows desktop and tell them to get a Mac or something else that’s simple. Better yet, if they get a Mac, you can tell them to visit the Apple Store for help instead of calling you. That’s another thing Windows PCs don’t offer — good manufacturer support. Image Credit: Blanca Stella Mejia on Flickr, Collin Andserson on Flickr, Luca Conti on Flickr     

    Read the article

  • CodePlex Daily Summary for Saturday, December 15, 2012

    CodePlex Daily Summary for Saturday, December 15, 2012Popular ReleasesAmarok Framework Library: 1.12: refactored agent-specific implementation introduced new interface representing important concepts like IDispatcher, IPublisher Refer to the Release Notes for details.sb0t v.5: sb0t 5.02b beta 2: Bug fix for Windows XP users (tab resize) Bug fix where ib0t stopped working A lot of command tweaksCRM 2011 Visual Ribbon Editor: Visual Ribbon Editor (1.3.1116.8): [FIX] Fixed issue not displaying CRM system button images correctly (incorrect path in file VisualRibbonEditor.exe.config)MySqlBackup.NET - MySQL Backup Solution for C#, VB.NET, ASP.NET: MySqlBackup.NET 1.5.6 beta: Fix Bug: If encryption is applied in Export Process, the generated encrypted SQL file is not able to import Stored Procedures, Functions, Triggers, Events and View. Fix Bug: In some unknown cases, the SHOW CREATE TABLE `tablename` query will return byte array. Improve 1: During Export, StreamWriter is opened and closed several times when writting to the dump file, which this is considered as not a good practice. Improve 2: SQL line in class Database method GetEvents: "SHOW EVENTS WHERE ...My Expenses Windows Store LOB App Demo: My Expenses Version 1: This is version 1 of the MyExpenses Windows 8 line of business demo app. The app is written in XAML and C#. It calls a WCF service that works with a SQL Server database. The app uses the Callisto toolkit. You can get it at https://github.com/timheuer/callisto. The Expenses.sql file contains the SQL to create the Expenses database. The ExpensesWCFService.zip file contains the WCF service, also written in C#. You should create a WCF service. Create an Entity Framework model and point it to...EasyTwitter: EasyTwitter basic operations: See the commit log to see the features!, check history files to see examples of how to use this libraryCommand Line Parser Library: 1.9.3.31 rc0: Main assembly CommandLine.dll signed. Removed old commented code. Added missing XML documentation comments. Two (very) minor code refactoring changes.BlackJumboDog: Ver5.7.4: 2012.12.13 Ver5.7.4 (1)Web???????、???????????????????????????????????????????VFPX: ssClasses A1.0: My initial release. See https://vfpx.codeplex.com/wikipage?title=ssClasses&referringTitle=Home for a brief description of what is inside this releaseHome Access Plus+: v8.6: v8.6.1213.1220 Added: Group look up to the visible property of the Booking System Fixed: Switched to using the outlook/exchange thumbnailPhoto instead of jpegPhoto Added: Add a blank paragraph below the tiles. This means that the browser displays a vertical scroller when resizing the window. Previously it was possible for the bottom edge of a tile not to be visible if the browser window was resized. Added: Booking System: Only Display Day+Month on the booking Home Page. This allows for the cs...Layered Architecture Solution Guidance (LASG): LASG 1.0.0.8 for Visual Studio 2012: PRE-REQUISITES Open GAX (Please install Oct 4, 2012 version) Microsoft® System CLR Types for Microsoft® SQL Server® 2012 Microsoft® SQL Server® 2012 Shared Management Objects Microsoft Enterprise Library 5.0 (for the generated code) Windows Azure SDK (for layered cloud applications) Silverlight 5 SDK (for Silverlight applications) THE RELEASE This release only works on Visual Studio 2012. 
Known Issue If you choose the Database project, the solution unfolding time will be slow....Fiskalizacija za developere: FiskalizacijaDev 2.0: Prva prava produkcijska verzija - Zakon je tu, ova je verzija uskladena sa trenutno važecom Tehnickom specifikacijom (v1.2. od 04.12.2012.) i spremna je za produkcijsko korištenje. Verzije iza ove ce ovisiti o naknadnim izmjenama Zakona i/ili Tehnicke specifikacije, odnosno, o eventualnim greškama u radu/zahtjevima community-a za novim feature-ima. Novosti u v2.0 su: - That assembly does not allow partially trusted callers (http://fiskalizacija.codeplex.com/workitem/699) - scheme IznosType...Simple Injector: Simple Injector v1.6.1: This patch release fixes a bug in the integration libraries that disallowed the application to start when .NET 4.5 was not installed on the machine (but only .NET 4.0). The following packages are affected: SimpleInjector.Integration.Web.dll SimpleInjector.Integration.Web.Mvc.dll SimpleInjector.Integration.Wcf.dll SimpleInjector.Extensions.LifetimeScoping.dllBootstrap Helpers: Version 1: First releasesheetengine - Isometric HTML5 JavaScript Display Engine: sheetengine v1.2.0: Main featuresOptimizations for intersectionsThe main purpose of this release was to further optimize rendering performance by skipping object intersections with other sheets. From now by default an object's sheets will only intersect its own sheets and never other static or dynamic sheets. This is the usual scenario since objects will never bump into other sheets when using collision detection. DocumentationMany of you have been asking for proper documentation, so here it goes. Check out the...DirectX Tool Kit: December 11, 2012: December 11, 2012 Ex versions of DDSTextureLoader and WICTextureLoader Removed use of ATL's CComPtr in favor of WRL's ComPtr for all platforms to support VS Express editions Updated VS 2010 project for official 'property sheet' integration for Windows 8.0 SDK Minor fix to CommonStates for Feature Level 9.1 Tweaked AlphaTestEffect.cpp to work around ARM NEON compiler codegen bug Added dxguid.lib as a default library for Debug builds to resolve GUID link issuesArcGIS Editor for OpenStreetMap: ArcGIS Editor for OSM 2.1 Final for 10.1: We are proud to announce the release of ArcGIS Editor for OpenStreetMap version 2.1. This download is compatible with ArcGIS 10.1, and includes setups for the Desktop Component, Desktop Component when 64 bit Background Geoprocessing is installed, and the Server Component. Important: if you already have ArcGIS Editor for OSM installed but want to install this new version, you will need to uninstall your previous version and then install this one. This release includes support for the ArcGIS 1...SharpCompress - a fully native C# library for RAR, 7Zip, Zip, Tar, GZip, BZip2: SharpCompress 0.8.2: This release just contains some fixes that have been done since the last release. Plus, this is strong named as well. I apologize for the lack of updates but my free time is less these days.Media Companion: MediaCompanion3.511b release: Two more bug fixes: - General Preferences were not getting restored - Fanart and poster image files were being locked, preventing changes to themVodigi Open Source Interactive Digital Signage: Vodigi Release 5.5: The following enhancements and fixes are included in Vodigi 5.5. 
Vodigi Administrator - Manage Music Files - Add Music Files to Image Slide Shows - Manage System Messages - Display System Messages to Users During Login - Ported to Visual Studio 2012 and MVC 4 - Added New Vodigi Administrator User Guide Vodigi Player - Improved Login/Schedule Startup Procedure - Startup Using Last Known Schedule when Disconnected on Startup - Improved Check for Schedule Changes - Now Every 15 Minutes - Pla...New Projects1Q86: testAdd Ratings to SharePoint Blog Site: This web part is an ‘administrative’ level tool used to update a Blog site to a) enable ratings on the post and b) add the User Ratings web part to the site. BadmintonBuddy: Source codeBetter Place Admin: Admin client for the BetterPlaceBooking ProjectBorkoSi: BorkoSi WebPage Source code.ChimpHD - 2D Evolved: High quality OpenCL accelerated 2D engine, capable of full complex yet easy to code 2D games. This is SpaceChimp2.0, wrote from the ground up, not just patchedCoderJoe: Repository for code samples used in my blog.CodeTemplate: ??????Data Persistent Object: ORM tool to access SQL Server, log record changes, Json and Linq supportedDevian: a simple web portal based on asp.net 2.0HD: hdHospital Tracking: hospital patient tracking ????? ???? ??????? ???? ?????Instant Get: An auction based c# applicationipangolin: DICOM stands for Digital Imaging and COmmunication in Medicine. The DICOM standard addresses the basic connectivity between different imaging devices.ISEFun: PowerShell module(s) to simplify work in it. It contains PowerShell scripts, compiled libs and some formating files. Several modules will come in one batch as optional features.Kerjasama: Aplikasi Database Kerjasama Bagian Kerjasama Kota SemarangMCEBuddy Viewer: Windows Media Center Plugin for MCEBuddy 2.x MusicForMyBlog: Link your recently played music in Itunes to your blog.My Expenses Windows Store LOB App Demo: This is a sample Windows 8 LOB app. Employees can use this app to create and submit expense reports. And managers can approve or reject those reports.Node Paint: An app based on the nodegarden application created by alphalabs.ccNPhysics: NPhysics - Physical Data Types for .NETomr.domready.js: Dom is ready? This project provides easy to detect ready event of dom.RegSecEdit: set registry security from command line, or batch file.Sitecore - Logger Module: This module provides an abstraction of the built-in Sitecore Logging features.SMART LMS: Welcome to SMART LMS, a learning management system with a difference, this software will allow teachers to create custom education activities such as quizzes.TaskManagerDD: DotNetNuke task manager tutorialThe Reactive Extensions for JavaScript: RxJS or Reactive Extensions for JavaScript is a library for transforming, composing, and querying streams of data.Timestamp: Tool for generating timestamps into clipboard for fast use.TweetGarden: Visualising connected users using their tweetsUpdateContentType: UpdateContentType atualiza os modelos de documentos utilizados por tipos de conteúdo do SharePoint a partir de documentos armazenados em uma biblioteca.User Rating Web Part for SharePoint 2010: User Rating Web Part - the fix for Ratings in SharePoint!webget: webgetwhatsnew.exe a command line utility to find new files: whatsnew.exe is a command line utility that lists the files created (new files) in a given number of days. 
whatsnew.exe 's syntax is very simple: whatsnew path numberofdays Also whatsnew supports other options like HTML or XML output, hyperlinked outputs and more.whoami: ip address resolverWPFReview: WPF code sample
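As a side note on the whatsnew.exe utility listed above, the core idea is small enough to sketch in a few lines of C#. This is purely hypothetical illustration code based on the description, not the project's actual source, and it leaves out the HTML/XML and hyperlinked output options it mentions.

using System;
using System.IO;
using System.Linq;

class WhatsNew
{
    // Usage: whatsnew path numberofdays
    static int Main(string[] args)
    {
        int days;
        if (args.Length != 2 || !int.TryParse(args[1], out days))
        {
            Console.WriteLine("Usage: whatsnew path numberofdays");
            return 1;
        }

        DateTime cutoff = DateTime.Now.AddDays(-days);

        // Recursively list every file created after the cutoff date.
        var newFiles = Directory.EnumerateFiles(args[0], "*", SearchOption.AllDirectories)
                                .Where(file => File.GetCreationTime(file) >= cutoff)
                                .OrderBy(file => file);

        foreach (string file in newFiles)
        {
            Console.WriteLine(file);
        }
        return 0;
    }
}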

    Read the article

  • Google+ Platform Office Hours for April 11, 2012: Recent Activity jQuery Plugin

Google+ Platform Office Hours for April 11, 2012: Recent Activity jQuery Plugin Here is the edited video from last week's Google+ platform office hours. Discuss this video on Google+: goo.gl This week we spent the first half of the show live coding a jQuery plugin that fetched recent public activity from a Google+ profile or page for inclusion on your website. Get the source code: goo.gl 1:15 - A demo of the implemented plugin 2:04 - The design of the plugin 2:57 - The coding begins! - Use the Google+ badge config tool to discover your userId (goo.gl or follow these instructions for pages: goo.gl Q&A 17:10 - Is there any kind of beta group that I can join for Google+? - Sign up for the publisher preview group - goo.gl 19:03 - When will the API be available? 20:11 - When will there be more moderation tools for hangouts? 21:36 - How do I get Hangouts on air? 22:18 - An update on last week's report of Google Analytics social actions not agreeing with the +1 button count 25:53 - How do I join the hangout for these office hours? 26:44 - Using the activities search API is there any way to only see new activities? 28:18 - An announcement about our future office hours schedule

    Read the article

  • Advice on selecting programming languages to concentrate on? (2nd year IT security student)

    - by Tyler J Fisher
    I'm in the process of considering which programming languages I should devote the majority of my coding studies to. I'm a 2nd year CS student, majoring in IT security. What I want to do/work with: Intelligence gathering Relational databases Virus design Snort network IPS Current coding experience (what I'm going to keep): Java - intermediate HTML5 - intermediate SQL (MySQL, Oracle 11g) - basic BASH - basic I'm going to need to learn (at least) one of the following languages in order to be successful in my field. Languages to add (at least 1): Ruby (+Metasploit) C++ (virus design, low-level driver interaction, computationally intensive applications) Python (import ALL the things) My dilemma: If I diversify too broadly, I won't be able to focus on, and improve in a specific niche. Does anyone have any advice as to how I should select a language? What I'm considering + why I'm leaning towards Ruby because of Metasploit support, despite lower efficiency when compared to Python. Any suggestions based on real-world experience? Should I focus on Ruby, Python, or C++? Both Ruby, and Python have been regarded as syntactically similar to Java which my degree is based around. I'm going to be studying C++ in two years as a component of my malicious code class. Thanks, Tyler

    Read the article

  • How can i be sure that professional programming is not for me ?

    - by user17766
Hello everybody. I love programming, developing projects as a hobby and learning new concepts. However, I am struggling a lot in my current job, despite having learned many things well. I can hardly even understand the tasks assigned to me, and I keep asking myself why I am making things so hard for myself. Maybe it is not my fault? Our architect doesn't spend enough time explaining the complicated parts of the project to us - or maybe I am just not smart enough to understand them quickly. Does our architect even know what kind of hell he is creating? Seeing three levels of generic types and four to five levels of generic inheritance in the domain model objects really makes me think so. It looks more like abusing concepts than reducing complexity. I suspect he has never worked on such a big project before, and he is getting confused by the project's problems himself. Maybe I am not in the right company? Maybe I am not a good programmer? Maybe I am really stupid? Being good at programming concepts is apparently not enough to deal with the complications of a big project - so should someone tell me that I still have to put in a lot of effort, even as a good programmer, to adapt myself to any big project? I also had some bad experiences at my previous job. My professional experience amounts to only a few months, but I have spent two years learning and coding for fun, and I can honestly say that I have solid skills in OOP, design patterns and coding standards, as well as deep knowledge of the language I currently use. Sometimes I think about leaving professional programming and working in some lame job while doing programming just as a hobby. I am waiting for your suggestions and insights.

    Read the article

  • New Big Data Appliance Security Features

    - by mgubar
The Oracle Big Data Appliance (BDA) is an engineered system for big data processing.  It greatly simplifies the deployment of an optimized Hadoop Cluster – whether that cluster is used for batch or real-time processing.  The vast majority of BDA customers are integrating the appliance with their Oracle Databases and they have certain expectations – especially around security.  Oracle Database customers have benefited from a rich set of security features:  encryption, redaction, data masking, database firewall, label-based access control – and much, much more.  They want similar capabilities with their Hadoop cluster.  Unfortunately, Hadoop wasn't developed with security in mind.  By default, a Hadoop cluster is insecure – the antithesis of an Oracle Database.  Some critical security features have been implemented – but even those capabilities are arduous to set up and configure.  Oracle believes that a key element of an optimized appliance is that its data should be secure.  Therefore, by default the BDA delivers the "AAA of security": authentication, authorization and auditing. Security Starts at Authentication A successful security strategy is predicated on strong authentication – for both users and software services.  Consider the default configuration for a newly installed Oracle Database; it's been a long time since you had a legitimate chance at accessing the database using the credentials "system/manager" or "scott/tiger".  The default Oracle Database policy is to lock accounts, thereby restricting access; administrators must consciously grant access to users. Default Authentication in Hadoop By default, a Hadoop cluster fails the authentication test. For example, it is easy for a malicious user to masquerade as any other user on the system.  Consider the following scenario that illustrates how a user can access any data on a Hadoop cluster by masquerading as a more privileged user.  In our scenario, the Hadoop cluster contains sensitive salary information in the file /user/hrdata/salaries.txt.  When logged in as the hr user, you can see the following files.  Notice, we're using the Hadoop command line utilities for accessing the data:

$ hadoop fs -ls /user/hrdata
Found 1 items
-rw-r--r--   1 oracle supergroup         70 2013-10-31 10:38 /user/hrdata/salaries.txt
$ hadoop fs -cat /user/hrdata/salaries.txt
Tom Brady,11000000
Tom Hanks,5000000
Bob Smith,250000
Oprah,300000000

User DrEvil has access to the cluster – and can see that there is an interesting folder called "hrdata".

$ hadoop fs -ls /user
Found 1 items
drwx------   - hr supergroup          0 2013-10-31 10:38 /user/hrdata

However, DrEvil cannot view the contents of the folder due to lack of access privileges:

$ hadoop fs -ls /user/hrdata
ls: Permission denied: user=drevil, access=READ_EXECUTE, inode="/user/hrdata":oracle:supergroup:drwx------

Accessing this data will not be a problem for DrEvil. He knows that the hr user owns the data by looking at the folder's ACLs. To overcome this challenge, he will simply masquerade as the hr user. On his local machine, he adds the hr user, assigns that user a password, and then accesses the data on the Hadoop cluster:

$ sudo useradd hr
$ sudo passwd
$ su hr
$ hadoop fs -cat /user/hrdata/salaries.txt
Tom Brady,11000000
Tom Hanks,5000000
Bob Smith,250000
Oprah,300000000

Hadoop has not authenticated the user; it trusts that the identity that has been presented is indeed the hr user. Therefore, sensitive data has been easily compromised. 
Clearly, the default security policy is inappropriate and dangerous to many organizations storing critical data in HDFS. Big Data Appliance Provides Secure Authentication The BDA provides secure authentication to the Hadoop cluster by default – preventing the type of masquerading described above. It accomplishes this thru Kerberos integration. Figure 1: Kerberos Integration The Key Distribution Center (KDC) is a server that has two components: an authentication server and a ticket granting service. The authentication server validates the identity of the user and service. Once authenticated, a client must request a ticket from the ticket granting service – allowing it to access the BDA’s NameNode, JobTracker, etc. At installation, you simply point the BDA to an external KDC or automatically install a highly available KDC on the BDA itself. Kerberos will then provide strong authentication for not just the end user – but also for important Hadoop services running on the appliance. You can now guarantee that users are who they claim to be – and rogue services (like fake data nodes) are not added to the system. It is common for organizations to want to leverage existing LDAP servers for common user and group management. Kerberos integrates with LDAP servers – allowing the principals and encryption keys to be stored in the common repository. This simplifies the deployment and administration of the secure environment. Authorize Access to Sensitive Data Kerberos-based authentication ensures secure access to the system and the establishment of a trusted identity – a prerequisite for any authorization scheme. Once this identity is established, you need to authorize access to the data. HDFS will authorize access to files using ACLs with the authorization specification applied using classic Linux-style commands like chmod and chown (e.g. hadoop fs -chown oracle:oracle /user/hrdata changes the ownership of the /user/hrdata folder to oracle). Authorization is applied at the user or group level – utilizing group membership found in the Linux environment (i.e. /etc/group) or in the LDAP server. For SQL-based data stores – like Hive and Impala – finer grained access control is required. Access to databases, tables, columns, etc. must be controlled. And, you want to leverage roles to facilitate administration. Apache Sentry is a new project that delivers fine grained access control; both Cloudera and Oracle are the project’s founding members. Sentry satisfies the following three authorization requirements: Secure Authorization:  the ability to control access to data and/or privileges on data for authenticated users. Fine-Grained Authorization:  the ability to give users access to a subset of the data (e.g. column) in a database Role-Based Authorization:  the ability to create/apply template-based privileges based on functional roles. With Sentry, “all”, “select” or “insert” privileges are granted to an object. The descendants of that object automatically inherit that privilege. A collection of privileges across many objects may be aggregated into a role – and users/groups are then assigned that role. This leads to simplified administration of security across the system. Figure 2: Object Hierarchy – granting a privilege on the database object will be inherited by its tables and views. Sentry is currently used by both Hive and Impala – but it is a framework that other data sources can leverage when offering fine-grained authorization. 
For example, one can expect Sentry to deliver authorization capabilities to Cloudera Search in the near future. Audit Hadoop Cluster Activity Auditing is a critical component to a secure system and is oftentimes required for SOX, PCI and other regulations. The BDA integrates with Oracle Audit Vault and Database Firewall – tracking different types of activity taking place on the cluster: Figure 3: Monitored Hadoop services. At the lowest level, every operation that accesses data in HDFS is captured. The HDFS audit log identifies the user who accessed the file, the time that file was accessed, the type of access (read, write, delete, list, etc.) and whether or not that file access was successful. The other auditing features include: MapReduce:  correlate the MapReduce job that accessed the file Oozie:  describes who ran what as part of a workflow Hive:  captures changes were made to the Hive metadata The audit data is captured in the Audit Vault Server – which integrates audit activity from a variety of sources, adding databases (Oracle, DB2, SQL Server) and operating systems to activity from the BDA. Figure 4: Consolidated audit data across the enterprise.  Once the data is in the Audit Vault server, you can leverage a rich set of prebuilt and custom reports to monitor all the activity in the enterprise. In addition, alerts may be defined to trigger violations of audit policies. Conclusion Security cannot be considered an afterthought in big data deployments. Across most organizations, Hadoop is managing sensitive data that must be protected; it is not simply crunching publicly available information used for search applications. The BDA provides a strong security foundation – ensuring users are only allowed to view authorized data and that data access is audited in a consolidated framework.

    Read the article

  • New release of Microsoft All-In-One Code Framework is available for download - March 2011

    - by Jialiang
    A new release of Microsoft All-In-One Code Framework is available on March 8th. Download address: http://1code.codeplex.com/releases/view/62267#DownloadId=215627 You can download individual code samples or browse code samples grouped by technology in the updated code sample index. If it’s the first time that you hear about Microsoft All-In-One Code Framework, please read this Microsoft News Center article http://www.microsoft.com/presspass/features/2011/jan11/01-13codeframework.mspx, or watch the introduction video on YouTube http://www.youtube.com/watch?v=cO5Li3APU58, or read the introduction on our homepage http://1code.codeplex.com/. -------------- New Silverlight code samples CSSLTreeViewCRUDDragDrop Download: http://1code.codeplex.com/releases/view/62253#DownloadId=215808 The code sample was created by Amit Dey. It demonstrates a custom TreeView with added functionalities of CRUD (Create, Read, Update, Delete) and drag-and-drop operations. Silverlight TreeView control with CRUD and drag & drop is a frequently asked programming question in Silverlight  forums. Many customers also requested this code sample in our code sample request service. We hope that this sample can reduce developers' efforts in handling this typical programming scenario. The following blog article introduces the sample in detail: http://blogs.msdn.com/b/codefx/archive/2011/02/15/silverlight-treeview-control-with-crud-and-drag-amp-drop.aspx. CSSL4FileDragDrop and VBSL4FileDragDrop Download: http://1code.codeplex.com/releases/view/62253#DownloadId=215809 http://1code.codeplex.com/releases/view/62253#DownloadId=215810 The code sample demonstrates the new drag&drop feature of Silverlight 4 to implement dragging picures from the local file system to a Silverlight application.   Sometimes we want to change SiteMapPath control's titles and paths according to Query String values. And sometimes we want to create the SiteMapPath dynamically. This code sample shows how to achieve these goals by handling SiteMap.SiteMapResolve event. CSASPNETEncryptAndDecryptConfiguration, VBASPNETEncryptAndDecryptConfiguration Download: http://1code.codeplex.com/releases/view/62253#DownloadId=215027 http://1code.codeplex.com/releases/view/62253#DownloadId=215106 In this sample, we encrypt and decrypt some sensitive information in the config file of a web application by using the RSA asymmetric encryption. This project contains two snippets. The first one demonstrates how to use RSACryptoServiceProvider to generate public key and the corresponding private key and then encrypt/decrypt string value on page. The second part shows how to use RSA configuration provider to encrypt and decrypt configuration section in web.config of web application. connectionStrings section in plain text: Encrypted connectionString:  Note that if you store sensitive data in any of the following configuration sections, we cannot encrypt it by using a protected configuration provider <processModel> <runtime> <mscorlib> <startup> <system.runtime.remoting> <configProtectedData> <satelliteassemblies> <cryptographySettings> <cryptoNameMapping> CSASPNETFileUploadStatus Download: http://1code.codeplex.com/releases/view/62253#DownloadId=215028 I believe ASP.NET programmers will like this sample, because in many cases we need customers know the current status of the uploading files, including the upload speed and completion percentage and so on. Under normal circumstances, we need to use COM components to accomplish this function, such as Flash, Silverlight, etc. 
CSASPNETFileUploadStatus
Download: http://1code.codeplex.com/releases/view/62253#DownloadId=215028
I believe ASP.NET programmers will like this sample, because in many cases we need to let users know the current status of a file upload, including the upload speed, the completion percentage and so on. The upload status can be observed in two places: the client side and the server side. On the client, for security reasons, the upload status cannot be obtained from JavaScript or server-side code, so a browser plug-in such as Flash or Silverlight is normally required. I do not like this approach, because the user has to install these components and we have to learn another programming framework. On the server side we can get the information through code, but the key question is how to return the results to the client. This sample therefore combines a custom HttpModule with AJAX to illustrate how to analyze the HTTP protocol, how to split the file request into packets, how to customize the location of the server-side file cache, how to return the upload status back to the client, and so on.
CSASPNETHighlightCodeInPage, VBASPNETHighlightCodeInPage
Download: http://1code.codeplex.com/releases/view/62253#DownloadId=215029 http://1code.codeplex.com/releases/view/62253#DownloadId=215108
This sample imitates a system that needs to display highlighted code in an ASP.NET page. Sometimes we paste code such as C# or HTML into a web page, and highlighting it gives a better reading experience and makes the code easier to follow. It is not difficult to highlight code in a web page by calling String.Replace directly: the method returns a new string in which all occurrences of a specified string are replaced with another specified string. However, this is not a good idea: it is slow, and it is hard to highlight multiple keywords, or to give different kinds of keywords different colors, with String.Replace alone. To handle this, the sample uses a hashtable that maps each supported language to its regular expressions and matching options, and defines the CSS styles used to highlight the code; the project then automatically applies the appropriate style to each matching piece of code. A step-by-step guide: 1. On the HighlightCodePage.aspx page, choose a language in the dropdown list, paste the code into the textbox, and click the HighLight button. 2. After the HighLight button is clicked, the highlighted code is displayed on the right side of the page.
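As a rough illustration of that Regex-based replacement (a minimal sketch; the keyword list and the "keyword" CSS class are invented here, not taken from the sample):

// A minimal sketch of the Regex-based replacement described above. The keyword
// list and the "keyword" CSS class are assumptions, not taken from the sample.
using System.Text.RegularExpressions;
using System.Web;

public static class CodeHighlighter
{
    // One pattern per supported language; only a small C# subset is shown.
    private static readonly Regex CSharpKeywords = new Regex(
        @"\b(class|public|private|static|void|int|string|return|using|new)\b",
        RegexOptions.Compiled);

    public static string Highlight(string source)
    {
        // HTML-encode first so the pasted code cannot inject markup of its own.
        string encoded = HttpUtility.HtmlEncode(source);

        // Wrap every keyword match in a styled span.
        return CSharpKeywords.Replace(encoded, "<span class=\"keyword\">$1</span>");
    }
}

A single compiled regular expression per language handles many keywords in one pass, which is the main advantage over repeated String.Replace calls.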
CSASPNETPreventMultipleWindows
Download: http://1code.codeplex.com/releases/view/62253#DownloadId=215032
This sample is a step-by-step guide illustrating how to detect and prevent the use of multiple windows or tabs in web applications. It imitates a system that needs to block multiple windows or tabs in order to avoid problems such as shared sessions, duplicate logins and data concurrency. There are many ways to achieve this goal; here we give a JavaScript-based solution that uses the window.name property to recognize legitimate navigation and redirect all other requests to an invalid-page notice. The sample uses two user controls to distinguish between the base page and the target page, so you only need to drag the appropriate control onto each web form page. You do not have to write repetitive code in every page, which keeps the coding light and makes the code easy to modify.
JSVirtualKeyboard
Download: http://1code.codeplex.com/releases/view/62253#DownloadId=215093
This article describes an All-In-One framework sample with a step-by-step guide to building a virtual keyboard in an HTML page. Sometimes we need to offer a virtual keyboard so that users can type without their physical keyboards. This often occurs when users enter a password to access our site and we want to protect the password from back-door software such as a key logger; a virtual keyboard on the page is a good choice here. To create a virtual keyboard, we first add some buttons to the page. When a user clicks a button, the JavaScript function handling the onclick event writes the appropriate character into the textbox. That is the basic logic of the feature. However, if we want the virtual keyboard to substitute for the real keyboard completely, we need more advanced logic to handle keys such as Caps Lock and Shift, which is considerably more work.
CSASPNETDataListImageGallery
Download: http://1code.codeplex.com/releases/view/62261#DownloadId=215267
This code sample demonstrates how to create an Image Gallery application by using the DataList control in ASP.NET. Image galleries are widely used on social networking sites, personal websites and e-business websites; for example, you may use one to show a library of uploaded images on a personal website, and slideshows are another popular way to display images. This sample uses the DataList and ImageButton controls to create an Image Gallery with image navigation: you can click a thumbnail in the DataList control to display a larger version of the image on the page. The code reads the image paths from a directory into a FileInfo array, which is then used to populate a custom DataTable object that is bound to the DataList control. The sample also implements custom paging that displays five images horizontally per page, using the following link buttons: First, Previous, Next, Last. Note: we recommend loading no more than five images at a time with this method. You can also set the DataList control's SelectedIndex property to limit which thumbnail can be selected, and its SelectedStyle property to indicate which image is selected.
VBASPNETSearchEngine
Download: http://1code.codeplex.com/releases/view/62253#DownloadId=215112
This sample shows how to implement a simple search engine in an ASP.NET web site. It uses a LIKE condition in a SQL statement to search the database, and then highlights the keywords in the search results by using regular expressions and JavaScript.
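A minimal sketch of that LIKE-based lookup (the table and column names are invented for illustration; the sample's own schema and helpers will differ):

// A minimal sketch of the LIKE-based search described above. The table and
// column names are invented for illustration; the sample's own schema differs.
using System.Data;
using System.Data.SqlClient;

public static class SimpleSearch
{
    public static DataTable Search(string connectionString, string keyword)
    {
        // Escape the LIKE wildcards in the user's input, then wrap it in %...%.
        string escaped = keyword
            .Replace("[", "[[]")
            .Replace("%", "[%]")
            .Replace("_", "[_]");

        var results = new DataTable();
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(
            "SELECT Title, Body FROM Articles WHERE Body LIKE @pattern", connection))
        {
            command.Parameters.AddWithValue("@pattern", "%" + escaped + "%");

            using (var adapter = new SqlDataAdapter(command))
            {
                adapter.Fill(results);   // Fill opens and closes the connection itself.
            }
        }
        return results;
    }
}

Parameterizing the query and escaping the LIKE wildcards keeps user input from being interpreted as SQL or as pattern characters.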
New Windows General code samples
CSCheckEXEType, VBCheckEXEType
Downloads: http://1code.codeplex.com/releases/view/62253#DownloadId=215045 http://1code.codeplex.com/releases/view/62253#DownloadId=215120
The sample demonstrates how to check an executable file's type. For a given executable file, it can determine: 1) whether it is a console application; 2) whether it is a .NET application; 3) whether it is a 32-bit native application; and 4) the full display name of a .NET application, e.g. System, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089, processorArchitecture=MSIL.
New Internet Explorer code samples
CSIEExplorerBar, VBIEExplorerBar
Downloads: http://1code.codeplex.com/releases/view/62253#DownloadId=215060 http://1code.codeplex.com/releases/view/62253#DownloadId=215133
The sample demonstrates how to create and deploy an IE Explorer Bar that lists all the images in a web page.
CSBrowserHelperObject, VBBrowserHelperObject
Downloads: http://1code.codeplex.com/releases/view/62253#DownloadId=215044 http://1code.codeplex.com/releases/view/62253#DownloadId=215119
The sample demonstrates how to create and deploy a Browser Helper Object; the BHO in this sample is used to disable the context menu in IE.
New Windows Workflow Foundation code samples
CSWF4ActivitiesCorrelation
Download: http://1code.codeplex.com/releases/view/62253#DownloadId=215085
Consider two running workflow instances, each consisting of a start, a Receive activity and a Receive2 activity. When a WCF request arrives for the second Receive2 activity, which instance should handle it? The answer is correlation. This sample shows how to correlate two workflow services so that they work together.
--------------
New ASP.NET code samples
CSASPNETBreadcrumbWithQueryString
Download: http://1code.codeplex.com/releases/view/62253#DownloadId=215022
Sometimes we want to change the SiteMapPath control's titles and paths according to query string values, and sometimes we want to build the SiteMapPath dynamically. This code sample shows how to achieve these goals by handling the SiteMap.SiteMapResolve event.
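For reference, the SiteMapResolve pattern this sample relies on generally looks like the sketch below. The page class and the "category" query-string key are assumptions, and a real page would also guard against subscribing the static event more than once per request.

// A sketch (assumptions noted above) of rewriting the current SiteMapPath
// node from the query string via the SiteMapResolve event.
using System;
using System.Web;

public partial class ProductPage : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // The event is static, so a real page should avoid stacking up
        // duplicate subscriptions across requests.
        SiteMap.SiteMapResolve += OnSiteMapResolve;
    }

    private SiteMapNode OnSiteMapResolve(object sender, SiteMapResolveEventArgs e)
    {
        if (SiteMap.CurrentNode == null)
        {
            return null;
        }

        // Clone the node so the shared site map itself is never modified.
        SiteMapNode current = SiteMap.CurrentNode.Clone(true);

        // "category" is an invented query-string key for illustration.
        string category = Request.QueryString["category"];
        if (!String.IsNullOrEmpty(category))
        {
            current.Title = category;
            current.Url = Request.RawUrl;
        }

        return current;
    }
}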

    Read the article

  • Teaching myself, as a physicist, to become a better programmer

    - by user787267
I've always liked physics, and I've always liked coding, so when I got the offer for a PhD position doing numerical physics (details are not relevant; it's mostly parallel programming for a cluster) at a university, it was a no-brainer for me. However, like most physicists, I'm self-taught. I don't have broad background knowledge about how to code in an object-oriented way, or the name of that specific algorithm that optimizes the search in some k-d tree. Since all my work so far has been more concerned with the physics and the scientific results, I undoubtedly have some bad habits, more so because my coding is my own and not really teamwork. I have mostly used C since it is very straightforward and "what you write is what you get": no need for fancy abstractions. However, I have recently switched to C++ since I'd like to learn more about the power that comes with abstraction, and it's pretty C-like (syntax-wise at least). How do I teach myself to code in a good, abstract way like a graduate in computer science? I know my code is efficient, but I want it to be elegant and readable as well. Keep in mind that I don't have time to read several 1000-page tomes about abstract programming. I need to spend time on actual, physics-related research (my supervisor would laugh at me if he knew I spent time thinking about how to program elegantly). How do I assess whether my work is also good from a programmer's perspective?

    Read the article

  • How to handle people who lie on their resume [closed]

    - by Juliet
Moderator comment: Please note that this is a two-year-old question that has just been migrated from Stack Overflow. Please take your time to read all the answers and ask yourself "would my answer add anything to this?" I'm conducting technical interviews to fill a few .NET positions. Many of the people I interview really do know .NET pretty well, but I find at least 90% embellish their skillset anywhere from "a little" to "quite drastically". Sometimes they fabricate skills relevant to the position they're applying for, sometimes they don't. Most of the people I interview, even the most egregious liars, are not scam artists. They just want to stand out among the crowd, so they drop a few buzzwords on their resume like "JBoss", "LINQ", "web services", "Django" or whatever, just to pad their skillset and stay competitive. (You might wonder whether a person who lies about those skills is just bluffing their way through a technical interview. My interviews involve a lot of hands-on coding and problem-solving; people who attempt to bluff will bomb the hands-on coding portion in the first 3 minutes.) These are two open-ended questions, but it would really help me out when I make my recommendations to the hiring managers: Regarding interviewing etiquette, should I attempt to determine whether a person really possesses all of the skills they claim to have? Can I do this without making the candidate feel uncomfortable? Regarding the final decision, should I recommend candidates who are genuinely qualified for the positions they're applying for, even if they've fabricated portions of their skillset?

    Read the article

  • CodePlex Daily Summary for Friday, October 11, 2013

CodePlex Daily Summary for Friday, October 11, 2013
Popular Releases
StackBuilder: StackBuilder 1.0.17.0: Corrected minor bug in box/case analysis.
PowerShell App Deployment Toolkit: PowerShell App Deployment Toolkit v3.0.6: Added PersistPrompt parameter to Show-InstallationWelcome and Show-InstallationPrompt. Prompt window is persistently returned to center screen after interval specified in config file (default 10 seconds). For Show-InstallationWelcome, this only takes effect if deferral is not available to user. The user will have no option but to respond to the prompt - resistance is futile! Added example advanced Office 2010 deployment script. Asynchronous actions now write to the same log file as synchro...
XrmServiceToolkit - A Microsoft Dynamics CRM 2011 & CRM 2013 JavaScript Library: XrmServiceToolkit 2.0.0 for CRM 2011 and CRM 2013: Version: 2.0.0 (beta). Date: October, 2013. Dependency: JSON2, jQuery (latest or 1.7.2 above). NOTE: due to the changes for CRM 2013, please use the attached version of JSON2 and jQuery. Tested Platform: IE10, latest Chrome, latest FireFox. Tested CRM Server: CRM 2013 on-premise RTM, CRM 2013 online, CRM 2011 with RU15. Changes: New Behavior - XrmServiceTookit.Soap.Fetch parameters change to work with asynchronous callback compared to 1.4.2 beta: XrmS...
MoreTerra (Terraria World Viewer): MoreTerra 1.11.3: New Features: New Markers added for Plantera's Bulb, Heart Fruits and Gold Cache. Markers now correctly display for the gems found in rock debris on the floor. Compatibility: Fixed header changes found in Terraria 1.0.3.1.
Dynamics AX 2012 R2 Kitting: First Beta release of Kitting: First Beta release of Kitting. Install by using XPO or Models.
C# Intellisense for Notepad++: Release v1.0.8.0: Fixed document formatting artifacts. To avoid the DLLs getting locked by the OS, use the MSI file for the installation.
CS-Script for Notepad++ (C# intellisense and code execution): Release v1.0.8.0: Fixed document formatting artifacts. To avoid the DLLs getting locked by the OS, use the MSI file for the installation.
Generic Unit of Work and Repositories Framework: v2.0: Async methods for Repositories - Ivan (@ifarkas). OData Async - Ivan (@ifarkas). Glimpse MVC4 working with MVC5. Glimpse EF6. Northwind.Repository Project (layer), best practices for extending the Repositories. Northwind.Services Project (layer), best practices for implementing a business facade. Live Demo: http://longle.azurewebsites.net/Spa/Product#/list Documentation: http://blog.longle.net/2013/10/09/upgrading-to-async-with-entity-framework-mvc-odata-asyncentitysetcontroller-kendo-ui-gli...
Media Companion: Media Companion MC3.581b: Fix in place for TVDB xml issue. New: Movie - General Preferences, allow saving of ignored 'The' or 'A' to end of movie title, stored in sorttitle field. Movie - New Way for Cropping Posters. Fixed: Movie - Rename of folders/filename, caught error message. Movie - Fixed bug in Save Cropped image, only saving in Pre-Frodo format if Both model selected. Movie - Fixed Cropped image didn't take zoomed ratio into effect. Movie - Separated Folder Renaming and File Renaming functions durin...
Ghostscript.NET: Ghostscript.NET v.1.1.1.: Fixed problem in GhostscriptRasterizer and GhostscriptViewer when MediaBox contains negative llx or lly values (problem reported by "Prasenjit Das"). Added GhostscriptPngDevice, a friendly output device class with all png-device-related switches (GhostscriptPngDevice supports: png16m, pngalpha, pnggray, png256, png16, pngmono, pngmonod). Added GhostscriptJpegDevice, a friendly output device class with all jpeg-device-related switches (GhostscriptJpegDevice supports: jpeg, jp...
xFunc: xFunc 2.7.3: Fixed memory leak. Small fixes.
SmartStore.NET - Free ASP.NET MVC Ecommerce Shopping Cart Solution: SmartStore.NET 1.2.0: Highlights: Multi-store support. "Trusted Shops" plugins. Highly improved SmartStore.biz Importer plugin. Add custom HTML content to pages. Performance optimization. New Features: Multi-store support: now multiple stores can be managed within a single application instance (e.g. for building different catalogs, brands, landing pages etc.). Added 3 new Trusted Shops plugins: Seal, Buyer Protection, Store Reviews. Added Display as HTML Widget to CMS Topics (store owner now can add arbitrary HT...
Fast YouTube Downloader: Youtube Downloader 2.1: Youtube Downloader 2.1.
NuGet: NuGet 2.7.1: Released October 07, 2013. Release notes: http://docs.nuget.org/docs/release-notes/nuget-2.7.1 Important note: after downloading the signed build of NuGet.exe, if you perform an update using the "nuget.exe update -self" command, it will revert back to the unsigned build.
Mugen MVVM Toolkit: Mugen MVVM Toolkit 2.0: Introduction: Mugen MVVM Toolkit makes it easier to develop Silverlight, WPF, WinRT and WP applications using the Model-View-ViewModel design pattern. The purpose of the toolkit is to provide a simple framework and set of tools for getting up to speed quickly with applications based on the MVVM design pattern. The core of the Toolkit contains a navigation system, windows management system, models, validation, etc. Mugen MVVM Toolkit contains all the MVVM classes such as ViewModelBase, RelayCommand,...
Office Ribbon Project (under active development): Ribbon (07. Oct. 2013): Fixed scrollbar bug if DropDown button is bigger than the screen. Added Office 2013 theme. Fixed: closing the Ribbon caused a null reference exception in RibbonButton.Dispose if the DropDown was not created yet. Fixed memory leak (unhooked events after Dispose). Fixed ToolStrip selected text 2013 and 2007 for Blue and Standard themes.
cmdradio: v0.1.1: Default download in win32. For other OS see here. This is an alpha version. Please report all bugs.
Squiggle - A free open source LAN Messenger: Squiggle 3.3 Alpha: Allow using environment variables in configuration file (history db connection string, download folder location, display name, group and message). Fix for history viewer to show the correct history entries. History saved with UTC timestamp. This is an alpha release and not recommended for use in production.
VidCoder: 1.5.7 Beta: Updated HandBrake core to SVN 4819. About dialog now pulls down HandBrake version from the DLL. Added a confirmation dialog to Stop if the encode has been going on for more than 5 minutes. Fixed handling of unicode characters for input and output filenames. We now encode to UTF-8 before passing to HandBrake. Fixed a crash in the queue multiple titles dialog. Added code to rescue tool windows which get placed outside of the visible screen area.
Wsus Package Publisher: Release v1.3.1310.05: Enhanced the "Reboot Remote Computers" feature by adding a timer before the reboot occurs, so that remote users can save their documents and close applications; you can also add a message to be displayed, and in 'Tools' -> 'Settings' -> Misc tab you can set a default message. Enhanced the "Compare Computers against AD" feature by choosing OUs to include in the comparison.
New Projects
A fork of the RPG Stendhal: Stendhal is a multiplayer online adventure game developed using the Arianne game development system. Stendhal is written using Java 6 and the Java2D environment.
Board Game Up: Welcome to the CodePlex site of the Board Game Up website. This contains the entire source code for www.boardgameup.com, the free board game meet-up website.
Business Information Organizer System: Goal: every aspect of small business: services, inventory, education, finance, and health. Must support every language and country.
CRM 2011 to 2013 Icon Upgrader: This application allows you to automatically upgrade CRM 2011 icons to CRM 2013 icons. It converts entity icons into grey scale.
Data Access: CQRS Highway: Command Query Responsibility Segregation pattern for data access.
djtest001: this is a summary Test1
DNN Evoq Social Invitation Registration: Invitation and Registration module for DNN Evoq Social.
Dynamics AX 2012 R2 Kitting: Dynamics AX 2012 R2 - Kitting. A solution for creating and using kits on sales orders.
IlluminatiEngine: This is a 3D deferred lighting engine; it has skinned mesh animation as well as instanced meshes (including skinned) and a few post-processing effects, DoF, etc.
jean1011jabbrchang: w
JessWork: for now let us make it empty
Killer Core: Starquake remake
Kinect Stream Saver Application: The application allows users to display and save the Kinect data streams. The Kinect Stream Saver application is based on the KinectExplorer-D2D (C++) sample.
LUA for Windows CE: LUA porting to Windows CE devices.
OpenDespatch for Peoplevox: OpenDespatch is an add-on for Peoplevox to complete despatch using external carriers such as DPD and RM-DMO.
Prontuário Eletrônico: A project for the Software Development Project course in the Information Systems degree, aiming to unify the electronic medical record.
River Niles Operating System: The project is research into a C# language program that will grow into a full operating system.
SocketIO4WinRT.Client: This is a port of the SocketIO4Net.Client project, to run on WinRT (Windows 8) for C# users.
SQL Data: SQL Data is an ORM for C# without any of the bloat.
Task Factory Bug Sample: This project is used to demonstrate a bug in the .NET 4.0 Runtime for Task Factory.
Team Manticore Student System: Student system application
VccDryad: VccDryad aims at extending VCC with proof strategies that enable automated verification of C programs that manipulate complex data structures.
X0oo0Z: X0oo0Z
YT: mouse move

    Read the article

  • multi user web game with scheduled processing?

    - by Rooq
I have an idea for a game which I am in the process of designing, but I am struggling to establish whether the way I plan to implement it is feasible. The game is a text-based sports management simulation. It will require players to take certain actions through a web browser which will interact with a database: adding, updating and selecting. Most of the code executed at this point will be fairly straightforward. The main processing will be done by applications scheduled to run on the server at certain times. These apps will process transactions added by the players and also perform some automatic processing based on the game date. My plan is to use a SQL Server database (at last count I require about 20 tables) and VB.NET for all the coding (coming from a mainframe programming background, this language is the simplest for me to get to grips with). I will also need a scheduling tool on the server. Can anyone tell me if what I am planning is feasible before I dive into the actual coding stage of my project?
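For illustration, the scheduled part of such a design is usually just a small command-line program that a scheduler (Windows Task Scheduler or SQL Server Agent, for example) runs at the configured times. A minimal sketch follows, written in C# rather than the VB.NET the poster plans to use, with invented table and column names:

// A minimal sketch of the scheduled, server-side processing step described
// above. Table and column names are invented for illustration.
using System;
using System.Data.SqlClient;

class MatchDayProcessor
{
    static void Main()
    {
        const string connectionString =
            "Server=.;Database=SportsSim;Integrated Security=true";

        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();

            // Read every player transaction queued since the last run.
            using (var select = new SqlCommand(
                "SELECT Id, PlayerId, ActionType FROM PendingActions WHERE Processed = 0",
                connection))
            using (SqlDataReader reader = select.ExecuteReader())
            {
                while (reader.Read())
                {
                    // Apply the game rules for this action here.
                    Console.WriteLine("Processing action {0} for player {1}",
                        reader.GetInt32(0), reader.GetInt32(1));
                }
            }

            // Mark the batch as done. A real implementation would mark rows
            // individually so actions queued mid-run are not flagged as
            // processed without being handled.
            using (var update = new SqlCommand(
                "UPDATE PendingActions SET Processed = 1 WHERE Processed = 0",
                connection))
            {
                update.ExecuteNonQuery();
            }
        }
    }
}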

    Read the article

  • I'm learning html and I'm confused as to how href's work. [migrated]

    - by Robolisk
Okay, so I'm learning HTML right now and soon CSS. In my HTML I have a section like this for navigation:
<div id="header">
  <h1>Guild Wars 2 Fanbase</h1>
  <ol id="navigation">
    <li><a href="/">Home</a></li>
    <li><a href="/facts">Facts</a></li>
    <li><a href="/gallery">Gallery</a></li>
    <li><a href="/code">Coding</a>
      <ul>
        <li><a href="/code/line">Lines</a></li>
        <li><a href="/code/comment">Comment Lines</a></li>
      </ul>
    </li>
  </ol>
</div>
Now when I open up this .html file, everything is laid out the way I want it to look (the markup, that is). My question is this: when I click on a link on this site (this site being this code), I get an error saying the webpage is not found, which is to be expected. But how do I create it so that the web pages work together? I'm not sure how to word it correctly. Like, do I create another .html file in the same directory so that somehow, when I click the link, it reads from the second .html file? If you're not sure what I'm asking, just let me know and I'll try to be more specific. Thank you for your help (: Excuse my mistakes in grammar; I'm not the world's best at English, but I'm trying my best (:

    Read the article

  • ArchBeat Link-o-Rama for 11/30/2011

    - by Bob Rhubart
Coding - the new Latin | @BBCRoryCJ
BBC Technology Correspondent Rory Cellan-Jones reports on why "the campaign to boost the teaching of computer skills - particularly coding - in schools is gathering force."
BPM Business Value Patterns | SOA Partner Community Blog
Juergen Kress shares the presentation he and Matthias Ziegler from Accenture delivered at the SOA & BPM Integration Days event in Germany in October.
Coherence 3.7.1 Resources
Busy blogger Juergen Kress shares links to screencasts and other resources for those interested in Oracle Coherence 3.7.1.
OBIEE 11.1.1 - Introduction to OBIEE 11g Full Sample App
"The OBIEE 11g Full Sample App (FSA) is a comprehensive collection of examples designed to demonstrate the latest Oracle BIEE 11g capabilities and design best practices."
Solaris 11 Customer Maintenance Lifecycle | Gerry Haskins
Gerry Haskins launches a new blog devoted to Solaris "policies, best practices, clarifications, and lots of other stuff."
Harnessing Business Events for Predictive Decision Making - part 1 / 3 | Sanjeev Sharma
"Data growth is outpacing storage capacity by a factor of two and computing power is still very much bounded by Moore's Law, doubling only every 18 months," says Sanjeev Sharma.
The Latest Research from the SEI | Douglas C. Schmidt
Schmidt shares information on several recently published Software Engineering Institute (SEI) technical reports that "highlight the latest work of SEI technologists in Agile methods, insider threat, the SMART Grid Maturity Model, acquisition, and CMMI."
Tiger/Line Shape Files and Oracle | Bradley D. Brown
"Have you ever needed to load an ESRI "shape file" and wondered if that's an easy effort or a difficult effort? I know I have and I assumed that it was a pretty difficult effort. However, I learned today that's actually pretty easy!" -- Oracle ACE Director Bradley Brown of TUSC.
Webcast: Enterprise Clouds with Oracle VM
Tuesday, December 6, 2011, 9:00 am PT / Noon ET. Featuring Adam Hawley (Senior Director of Product Management, Oracle) and Dan Herrup (Principal Systems Engineer, Oracle Corporate Citizenship).
SOA Made Simple; Architects in AZ; Cloud Migration Introduction
This week on the Architect Home Page on OTN.

    Read the article

  • New college grad, psychology major, wants to code professionally. Should I get Sun Java-certified?

    - by Anita
    I just graduated from a fairly well-known liberal arts college in May. Interestingly, I majored in psychology, with a concentration in social psychology. In college I took Intro to Computer Science and hated it (used to blame it on myself; now I blame it on the professor :) However, I've always wanted to be a programmer, and finally got my wish by getting hired by a company that was willing to let me learn coding from scratch in exchange for low pay. Well, what do you know, I just got laid off this morning, and need a new job by November to pay the bills. I loved the coding part of my job at the company, and managed to learn enough Java to feel competent in the job and curious to learn more. I think my goal now is to become a professional programmer. I still know very little (never used Swing, for example) but nothing that a good book can't fix. That's the background anyway; sorry for the rambling - I'm still in shock from the layoff :( It seems to me the quickest way to get noticed by companies, without a CS degree, is by getting certification. I'm halfway through studying for the SCJP and can probably sit for an exam in a week or two. Am I right in my assumption that certs will help in my case? And in general, do I have a bat's chance in hell of making it against formally trained programmers? My assets are really just raw intelligence and intense curiosity; well, maybe a love for problem-solving too. Thanks all - feel free to edit/tag the post!

    Read the article

  • How Visual Studio 2010 and Team Foundation Server enable Compliance

    - by Martin Hinshelwood
One of the things that makes Team Foundation Server (TFS) the most powerful Application Lifecycle Management (ALM) platform is the traceability it provides to those that use it. This traceability is crucial in enabling many companies to adhere to the compliance regulations to which they are bound (e.g. CFR 21 Part 11 or Sarbanes-Oxley). From something as simple as relating Tasks to check-ins, or being able to see the top 10 files in your codebase that are causing the most Bugs, to identifying which Bugs and Requirements are in which Release: all that information, and more, is available in TFS. Although all of this traceability is available within TFS, you do need to understand that it is not free. Well... I say that, but if you are using TFS properly you will have this information with no additional work beyond firing up the reporting. Using Visual Studio ALM and Team Foundation Server you can relate every line of code changed all the way up to Requirements and back down through Test Cases to the Test Results.
Figure: The only thing missing is Build
In order to build the relationship model below we need to examine how each of the relationships gets there. Every member of your team, from programmer to tester and Business Analyst to the Business, has a role to play in knitting this together.
Figure: The relationships required to make this work can get a little confusing
If Build is added to this to relate Work Items to Builds, and you know which builds are in which environments, you can easily identify what is contained within a Release.
Figure: How are things progressing
Along with the ability to produce progress and trend reports, the traceability built into TFS can be used to fulfil most audit requirements out of the box, and can be augmented to fulfil the rest. In order to understand the relationships, let's look at each of the important Artifacts and how they are associated with each other.
Requirements – The root of all knowledge
Requirements are the things that the business cares about delivering. They could be derived from User Stories or Business Requirements Documents (BRDs), but they should be what the Business asks for. Requirements can be related to many of the Artifacts in TFS, so let's look at the model:
Figure: If the centre of the world was a requirement
We can track which releases Requirements were scheduled in, but this can change over time as more details come to light.
Figure: Who edited the Requirement and when
There is also the ability to query Work Items based on the history of changes made to them. This is particularly important with Requirements: it might not be enough to know which Requirements were completed in a given release; you may also need to know which Requirements were ever assigned to a particular release.
Figure: Some magic required, but result still achieved
As an augmentation to this it is also possible to run a query that shows results from the past, just as if we had a time machine. You can take any query in the system and add an "asof" clause at the end to query historical data in the TFS operational store. select <fields> from WorkItems [where <condition>] [order by <fields>] [asof <date>]
Figure: Work Item Query Language (WIQL) format
To achieve this you do need to save the query as a *.wiql file on your local computer and edit it in Notepad, but once imported into TFS you can run it any time you want.
Figure: Saving Queries locally can be useful
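The same historical query can also be run from code through the TFS client object model rather than a saved .wiql file. A rough sketch (the collection URL, project name and date below are placeholders, not values from the article):

// A rough sketch of running an "asof" query through the TFS client object model.
// The collection URL, project name and date are placeholders.
using System;
using Microsoft.TeamFoundation.Client;
using Microsoft.TeamFoundation.WorkItemTracking.Client;

class AsOfQuery
{
    static void Main()
    {
        var collection = TfsTeamProjectCollectionFactory.GetTeamProjectCollection(
            new Uri("http://tfs.example.com:8080/tfs/DefaultCollection"));
        var store = collection.GetService<WorkItemStore>();

        // The state of all Requirements as they were on a chosen past date.
        string wiql =
            "SELECT [System.Id], [System.Title], [System.State] " +
            "FROM WorkItems " +
            "WHERE [System.TeamProject] = 'MyProject' " +
            "AND [System.WorkItemType] = 'Requirement' " +
            "ASOF '1/1/2011'";

        foreach (WorkItem item in store.Query(wiql))
        {
            Console.WriteLine("{0}: {1} ({2})", item.Id, item.Title, item.State);
        }
    }
}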
All of these audit features are available throughout the Work Item Tracking (WIT) system within TFS.
Tasks – Where the real work gets done
Tasks are the workhorse of the development team, but they are only as useful as Excel if you do not relate them properly to other Artifacts.
Figure: The Task Work Item Type has its own relationships
Requirements should be broken down into Tasks that the development team works from to build what the business requires. This may be done by a small dedicated group or by everyone that will be working on the software, but however it happens, every Task created should be a child of a Requirement Work Item Type.
Figure: Tasks are related to the Requirement
Tasks should be used to track the day-to-day activities of the team working to complete the software, and as such they should be kept simple and short lest developers think they are more trouble than they are worth.
Figure: Task Work Item Type has a narrower purpose
Although the Task Work Item Type describes the work that will be done, the actual development work involves making changes to files that are under source control. These changes are bundled together in a single atomic unit called a Changeset, which is committed to TFS in a single operation. During this operation developers can associate Work Items with the Changeset.
Figure: Tasks are associated with Changesets
Changesets – Who wrote this crap
Changesets themselves are just an inventory of the changes that were made to a number of files to complete a Task.
Figure: Changesets are linked by Tasks and Builds
Figure: Changesets tell us what happened to the files in Version Control
Although comments can be changed after the fact, the inventory and Work Item associations are permanent, which allows us to audit all the way down to the individual change level.
Figure: On Check-in you can resolve a Task which automatically associates it
Because of this we can view the history of any file within the system and see how many changes have been made and which Changesets they belong to.
Figure: Changes are tracked at the File level
What would be even more powerful would be if we could view these changes superimposed over the top of the lines of code. Some people call this a blame tool because it is commonly used to find out which developer introduced a bug, but it can also be used as another method of auditing changes to the system.
Figure: Annotate shows the lines
The Annotate functionality allows us to visualise the relationship between the individual lines of code and the Changesets. In addition to this you can create a Label and apply it to a version in your version control. The problem with Labels is that they can be changed after they have been created, with no traceability. This makes them practically useless for any sort of compliance audit. So what do you use?
Branches – And why we need them
Branches are a really powerful tool for development and release management, but they are most important for audits.
Figure: One way to Audit releases
The R1.0 branch can be created from the Label that the Build creates on the R1 line when a Release build is produced. It can be created as soon as the Build has been signed off for release. However, it is still possible that someone changed the Label between that time and the branch's creation. A better method is to explicitly link the Build output to the Build.
Builds – Let's tie some more of this together
Builds are the glue that enables the next level of traceability by tying everything together.
Figure: The dashed pieces are not out of the box but can be enabled
When the Build is called and starts, it looks at what it has been asked to build and determines what code it is going to get and build.
Figure: The folder identifies what changes are included in the build
The Build sets a Label on the source with the same name as the Build, and the Build itself also records the latest Changeset ID that it will be building. At the end of the Build, the Build Agent identifies the new Changesets it has built by looking at the check-ins that have occurred since the last Build.
Figure: What changes have been made since the last successful Build
It then uses that information to identify the Work Items associated with all of those Changesets; the Changesets are associated with the Build, and the "Integrated In" field of those Work Items is changed.
Figure: Find all of the Work Items to associate with
The "Integrated In" field of every Work Item identified by the Build Agent as being integrated into the completed Build is updated to reflect the Build number that successfully integrated that change.
Figure: Now we know which Work Items were completed in a build
We can now link a single line of code changed all the way back through the Task that initiated the action to the Requirement that started the whole thing, and back down to the Build that contains the finished Requirement. But how do we know whether that Requirement has been fully tested, or even meets the original Requirements?
Test Cases – How we know we are done
The only way to know whether a Requirement has been completed to the required specification is to test that Requirement. In TFS there is a Work Item type called a Test Case. Test Cases enable two scenarios. The first is the ability to track and validate Acceptance Criteria in the form of a Test Case. If you agree with the Business on a set of goals that must be met for a Requirement to be accepted, it becomes difficult for them to reject a Requirement that passes all of the tests, and it also provides a level of traceability and validation for audit that a feature has been built and tested to order.
Figure: You can have many Acceptance Criteria for a single Requirement
For this to work it is crucial that someone from the Business signs off on the Test Case moving from the "Design" to the "Ready" state.
The second is the ability to associate an MSTest test with the Test Case, thereby tracking the automated test. This is useful when you want to track a test, and the test results, of a Unit Test designed to prove the existence of a Bug and then guard against its reoccurrence.
Figure: Associating a Test Case with an automated Test
Although it is possible, it may not make sense to track the execution of every Unit Test in your system; however, there are many automated Integration and Regression tests that it would make sense to track in this way.
Bug – Let's not have regressions
In order to know whether a Bug in the application has been fixed, and to make sure that it does not reoccur, it needs to be tracked.
Figure: Bugs are the centre of their own world
If the fix for a Bug is big enough to require being broken down into Tasks, then it is probably a Requirement. You can associate a check-in with a Bug and have it tracked against a Build.
You would also have one or more Test Cases to prove the fix for the Bug.
Figure: Bugs have many associations
This allows you to track Bugs / Defects in your system effectively and report on them.
Change Request – I am not a feature
In the CMMI process template, Change Requests can also be easily tracked through the system. In some cases it can be very important to track Change Requests separately, as an auditor may want to know what was changed and who authorised it. Again, and similar to Bugs, if the Change Request is big enough that it needs to be broken down into Tasks, it is in reality a new feature and should be tracked as a Requirement.
Figure: Make sure your Change Requests only affect Requirements and do not rewrite them
Conclusion
Visual Studio 2010 and Team Foundation Server together provide an exceptional Application Lifecycle Management platform that can help your team comply with even the harshest of compliance requirements while still enabling them to be Agile. Most audits are heavy on required documentation, but most of that information is captured for you as long as you do it right. You don't even need every team member to understand it all, as each of the Artifacts is relevant to a different type of team member:
Business Analysts manage Requirements and Change Requests
Programmers manage Tasks and check in against Change Requests and Bugs
Testers manage Bugs and Test Cases
Build Masters manage Builds
Although there is some crossover, there are still roles, or "hats", that are worn. Do you think this is all achievable? Have I missed anything that you think should be there?

    Read the article

  • How to manage a developer who has poor communication skills

    - by djcredo
I manage a small team of developers on an application which is at the mid-point of its lifecycle, within a big firm. This unfortunately means there is commonly a 30/70 split of programming tasks to "other technical work". This work includes:
Working with DBA / Unix / Network / load balancer teams on various tasks
Placing and managing orders for hardware or infrastructure in different regions
Running tests that have not yet been migrated to CI
Analysis
Support / Investigation
It's fair to say that the developers would all prefer to be coding rather than doing these more mundane tasks, so I try to hand out the fun programming jobs evenly amongst the team. Most of the team was hired because, though they may not have the elite programming skills to write their own compiler / game engine / high-frequency trading system etc., they are good communicators who "can get stuff done", work with other teams, and can somewhat navigate the complex bureaucracy here. They are good developers, but they are also good all-round technical staff. However, one member of the team probably has above-average coding skills but below-average communication skills. Traditionally, the previous Development Manager tended to give him the programming tasks and not the more mundane tasks listed above. However, I don't feel that this is fair to the rest of the team, who have shown an aptitude for developing the well-rounded skillset that is commonly required in a big-business IT department. What should I do in this situation? If I continue to give him more programming work, I know that it will be done faster (and conversely, I would expect him to complete the other work more slowly). But it goes against my principles, and it promotes the idea that you can carve out a "comfortable niche" for yourself simply by being bad at the tasks you don't like.

    Read the article
