Search Results

Search found 7875 results on 315 pages for 'wired networking'.


  • HOSTS ignored when disconnected [closed]

    - by Synetech
    Problem

    I'm seeing a strange and extremely frustrating problem. Any system that is not connected to the Internet (Windows 7 shows the no-Internet-access icon because it cannot constantly ping Microsoft's servers) cannot even access locally hosted servers.

    Hypothesis

    The problem appears to be that the HOSTS file is not being used to resolve DNS entries when there are no active NICs.

    Tests / Reproduction

    You can reproduce it as follows:

    1. Disconnect a system from the Internet (make sure all wired and wireless connections are disconnected).
    2. If necessary, add an entry to the HOSTS file (e.g., 127.0.0.1 foobar or 127.0.0.1 foobar.com).
    3. Open a command prompt.
    4. Type ping foobar or ping foobar.com.

    Observations

    The screenshots below show a clear and demonstrative example. In the first snap, a laptop is connected to a router wirelessly. The HOSTS file has only three entries and they resolve just fine. In the second snap, the wireless radio is turned off, so the entries in the HOSTS file are ignored. Moreover, notice that pinging localhost still works even without any active NICs (as does 127.0.0.1), but it is using the IPv6 address (it must be hard-coded). You can see the same results in Windows XP with no IPv6 installed, so it has nothing to do with IPv6. I tried pinging what should have resolved to 127.0.0.1 while the desktop system (with no wireless NICs) was connected via its Ethernet adapter, then again after pulling the cable from the router and waiting a couple of seconds, then again after plugging the cable back in. The same thing happens if, instead of pulling out the cable, the NIC is disabled through software (the [Disable] button in the NIC's Status dialog or via Device Manager).

    Conclusions

    It looks as though the HOSTS file is only being read and used if there is an active NIC; otherwise it is being ignored. This makes some sense: if there are no active network adapters, then presumably there will not be any network activity, and thus no need to resolve host names via the HOSTS file. This assumption is specious, however, because it precludes locally hosted virtual servers. The HOSTS file should be used regardless of external DNS server connectivity; otherwise you cannot use simple, consistent testing/production names for locally hosted servers when not connected to the Internet (for example web servers; help servers for Visual Studio, 3ds Max, etc.; and so on).

    Question

    Does anyone know how to force Windows to use the HOSTS file even if there are no active NICs?

    Appendix

    Figure 1: While the wireless NIC is connected to the router (the cable modem is in standby, so there is no external Internet connectivity).
    Figure 2: With the wireless radio turned off (the Ethernet port is not connected in either case).
    Figure 3: Same results in XP with no IPv6.
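
    One workaround worth trying (my suggestion, not something from the original post) is to install the Microsoft Loopback Adapter so that at least one NIC always stays active; with an active interface present, Windows should consult the HOSTS file again. A sketch using devcon.exe, which is not built into Windows (it ships with the Windows Driver Kit):

        :: Hypothetical workaround: install the Microsoft Loopback Adapter so
        :: one NIC is always "up" even with all physical adapters disabled.
        :: devcon.exe comes from the WDK, not from Windows itself.
        devcon.exe install %WINDIR%\inf\netloop.inf *MSLOOP

        :: Then re-test HOSTS resolution with all real NICs disconnected:
        ping foobar

    The new adapter needs no cabling and can be left unconfigured; the point is only that the resolver keeps treating the machine as networked.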


  • Windows Setup could not configure Windows to run on this computer's hardware

    - by Hello71
    The whole installation goes smoothly up to the point of "Completing installation ...". The monitor changes resolution, after which a standard dialog box pops up saying:

        Windows Setup could not configure Windows to run on this computer's hardware

    Then, in a few seconds, the whole machine powers down. Trying to restart produces the message:

        STOP: c000021a {Fatal System Error} 0x00000000 (0xc0000001 0x00100448)

    OR it boots into Setup and comes up with the message "Windows Setup encountered an unexpected error..." (this is not the actual error, just paraphrasing). I tried using the OEM restore instead of a regular install, but it fails with the same error (even though it worked before...).

    General specs: HP Pavilion Elite e9262f; Intel Core i5-750 processor; ATI Radeon HD 4650; Hitachi HDT721010SLA360 ATA device; 6GB DDR3 RAM; SuperMulti DVD burner with LightScribe; some built-in Wi-Fi module. http://h10025.www1.hp.com/ewfrf/wc/document?docname=c01916917

    I've tried disconnecting the wireless card and disabling the built-in Ethernet and FireWire via the BIOS, and replacing the wireless keyboard and mouse with wired USB ones. Didn't work. I've also tried changing the SATA controller settings in the BIOS to RAID, AHCI, and IDE, reinstalling each time I changed. Still not working. I think the reason why it is showing the Fatal System Error is that it didn't finish installing before it errored out and shut down, so the system is left in an inconsistent state.

    I've tried 3 different copies (including the OEM restore) of Windows 7 now, and they all fail at the same point, with the same error message. I've tried to install Windows 7 maybe 10 times already, with the exact same error message at the exact same location. Interestingly, the 32-bit version of Windows 7 works, but the 64-bit version doesn't. Perhaps it was a badly burned disk? Reburning the 64-bit version still comes up with the same error. Here's a picture of the side of the case that clearly says it came with Windows 7 64-bit, along with the model number and CPU.

    sudo fdisk -l:

        Disk /dev/sda: 1000.2 GB, 1000204886016 bytes
        255 heads, 63 sectors/track, 121601 cylinders
        Units = cylinders of 16065 * 512 = 8225280 bytes
        Sector size (logical/physical): 512 bytes / 512 bytes
        I/O size (minimum/optimal): 512 bytes / 512 bytes
        Disk identifier: 0x0009896f

           Device Boot      Start         End      Blocks   Id  System
        /dev/sda1   *           1          13      104391    7  HPFS/NTFS
        /dev/sda2              14       94119   755906445    7  HPFS/NTFS
        /dev/sda3          119922      121602    13492224    7  HPFS/NTFS
        /dev/sda4           94120      119922   207257740+   5  Extended
        /dev/sda5          119527      119922     3170769   82  Linux swap / Solaris
        /dev/sda6          107174      119526    99225441   83  Linux
        /dev/sda7           94120      107173   104856192    7  HPFS/NTFS

        Partition table entries are not in disk order
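
    Since a badly burned disk is one of the suspects and a Linux environment is clearly available (see the fdisk output above), it may be worth checking the ISO's hash against the checksum published for the download before burning again. A sketch; the file name is a placeholder, not from the original post:

        # Compare this against the SHA-1 published for the Windows 7 x64 ISO.
        # "win7_x64.iso" is a placeholder path.
        sha1sum win7_x64.iso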


  • Ubuntu: Multiple NICs, one used only for Wake-On-LAN

    - by jcwx86
    This is similar to some other questions, but I have a specific need which is not covered in the other questions. I have an Ubuntu server (11.10) with two NICs. One is built into the motherboard and the other is a PCI Express card. I want my server connected to the Internet via my NAT router and also able to wake from suspend using a Magic Packet (henceforth referred to as Wake-On-LAN, WOL). I can't do this with just one of the NICs because each has an issue: the built-in NIC will crash the system if it is placed under heavy load (typically downloading data), whilst the PCI Express NIC will crash the system if it is used for WOL. I have spent some time investigating these individual problems, to no avail.

    My plan is thus: use the built-in NIC solely for WOL, and use the PCI Express card for all other network communication except WOL. Since I send the WOL Magic Packet to a specific MAC address, there is no danger of hitting the wrong NIC, but there is a danger of using the built-in NIC for general network access, overloading it and crashing the system. Both NICs are wired to the same LAN with address space 192.168.0.0/24. The built-in Ethernet card is set to have interface name eth1 and the PCI Express card is eth0 in Ubuntu's udev persistent rules (so they stay the same upon reboot). I have been trying to set this up with the /etc/network/interfaces file. Here is where I am currently:

        auto lo
        iface lo inet loopback

        auto eth0
        iface eth0 inet static
            address 192.168.0.3
            netmask 255.255.255.0
            network 192.168.0.0
            broadcast 192.168.0.255
            gateway 192.168.0.1

        auto eth1
        iface eth1 inet static
            address 192.168.0.254
            netmask 255.255.255.0

    I think that by not specifying a gateway for eth1, I prevent it being used for outgoing requests. I don't mind if it can be reached on 192.168.0.254 on the LAN, i.e. via SSH -- its IP is irrelevant to WOL, which is based on MAC addresses -- I just don't want it to be used to access Internet resources. My kernel routing table (from route -n) is:

        Kernel IP routing table
        Destination     Gateway         Genmask         Flags Metric Ref    Use Iface
        0.0.0.0         192.168.0.1     0.0.0.0         UG    100    0        0 eth0
        169.254.0.0     0.0.0.0         255.255.0.0     U     1000   0        0 eth0
        192.168.0.0     0.0.0.0         255.255.255.0   U     0      0        0 eth0
        192.168.0.0     0.0.0.0         255.255.255.0   U     0      0        0 eth1

    My question is this: is this sufficient for what I want to achieve? My research has thrown up the idea of using static routing to specify that eth1 should only be used for WOL on the local network, but I'm not sure this is necessary. I have been monitoring the activity of the interfaces using iptraf and it seems like eth0 takes the vast majority of the packets, though I am not sure that this will be consistent based on my configuration. Given that if I mess up the configuration my system will likely crash, it is important to me to have this set up correctly!
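
    For what it's worth, a quick way to confirm that eth1 really is armed for Magic Packet wake, and to test it, assuming the ethtool and wakeonlan packages are installed (the MAC address below is a placeholder):

        # Check that "Wake-on" shows "g" (wake on Magic Packet) for eth1:
        sudo ethtool eth1 | grep Wake-on

        # Enable it if not; this setting is often reset at boot, so it may
        # belong in a post-up line for eth1 in /etc/network/interfaces:
        sudo ethtool -s eth1 wol g

        # From another machine on the LAN, wake the server by eth1's MAC
        # (00:11:22:33:44:55 is a placeholder):
        wakeonlan 00:11:22:33:44:55

    Since eth1 has no gateway, it should indeed never be chosen for off-LAN traffic; the duplicate 192.168.0.0/24 route on eth1 only risks it answering local traffic, which you said you don't mind.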


  • I am unable to connect to my netbook from any machine on my network until the netbook has pinged it

    - by Samuel Husky
    I have a rather strange issue with my netbook on my local network. When trying to connect to it in any way from a remote system, it does not appear to be found. However, if I get the netbook to ping the machine trying to connect, it mysteriously starts to work. Below is the ping test from my main PC to the netbook:

        C:\Users\Sam>ping 192.168.8.102

        Pinging 192.168.8.102 with 32 bytes of data:
        Reply from 192.168.8.100: Destination host unreachable.
        Reply from 192.168.8.100: Destination host unreachable.
        Reply from 192.168.8.100: Destination host unreachable.
        Reply from 192.168.8.100: Destination host unreachable.

        Ping statistics for 192.168.8.102:
            Packets: Sent = 4, Received = 4, Lost = 0 (0% loss),

    Now a ping from the netbook to my main PC:

        sam@malamute ~ $ ping 192.168.8.100
        PING 192.168.8.100 (192.168.8.100) 56(84) bytes of data.
        64 bytes from 192.168.8.100: icmp_req=1 ttl=128 time=2.46 ms
        64 bytes from 192.168.8.100: icmp_req=2 ttl=128 time=0.835 ms
        64 bytes from 192.168.8.100: icmp_req=3 ttl=128 time=1.60 ms
        64 bytes from 192.168.8.100: icmp_req=4 ttl=128 time=1.32 ms
        64 bytes from 192.168.8.100: icmp_req=5 ttl=128 time=1.34 ms
        ^C
        --- 192.168.8.100 ping statistics ---
        5 packets transmitted, 5 received, 0% packet loss, time 4004ms
        rtt min/avg/max/mdev = 0.835/1.514/2.460/0.536 ms

    And the same ping again from the main PC after the netbook has made a connection to it:

        C:\Users\Sam>ping 192.168.8.102

        Pinging 192.168.8.102 with 32 bytes of data:
        Reply from 192.168.8.102: bytes=32 time=1ms TTL=64
        Reply from 192.168.8.102: bytes=32 time=1ms TTL=64
        Reply from 192.168.8.102: bytes=32 time=1ms TTL=64
        Reply from 192.168.8.102: bytes=32 time=1ms TTL=64

        Ping statistics for 192.168.8.102:
            Packets: Sent = 4, Received = 4, Lost = 0 (0% loss),
        Approximate round trip times in milli-seconds:
            Minimum = 1ms, Maximum = 1ms, Average = 1ms

    The netbook is running Gentoo and is currently connected via wireless. My main PC is running Windows 7; however, I get the same result no matter what PC I use on this network. Please see this example from a CentOS machine on the same network:

        [root@tiger ~]# ping 192.168.8.102
        PING 192.168.8.102 (192.168.8.102) 56(84) bytes of data.
        From 192.168.8.200 icmp_seq=2 Destination Host Unreachable
        From 192.168.8.200 icmp_seq=3 Destination Host Unreachable
        From 192.168.8.200 icmp_seq=4 Destination Host Unreachable

        --- 192.168.8.102 ping statistics ---
        6 packets transmitted, 0 received, +3 errors, 100% packet loss, time 5000ms, pipe 3

    If you need any more information or require logs or config files, please let me know; any assistance is greatly appreciated.

    Additional info: no responses in a TCP dump on the netbook; the same result when booting into Ubuntu from a USB key; no issue when using a wired Ethernet connection.
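
    Given the symptoms (unreachable until the netbook itself transmits, wireless only, nothing arriving in a TCP dump), one plausible culprit (an assumption on my part, not a confirmed diagnosis) is the wireless card's power-save mode sleeping through ARP requests. Worth trying on the netbook, assuming wlan0 is the wireless interface:

        # See whether power management is enabled on the wireless interface:
        iwconfig wlan0

        # Disable it, then re-test reachability from another machine:
        sudo iwconfig wlan0 power off

        # Watch whether ARP requests from the other machine even arrive:
        sudo tcpdump -n -i wlan0 arp

    If ARP requests never show up while the interface is idle, the problem is below the IP layer (driver, power save, or the access point), which would match the "works after the netbook pings first" behavior, since the netbook's own traffic populates the other machines' ARP caches.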


  • DELL DRAC & Ubuntu VPN Connection

    - by Mikunos
    I am trying to connect to a DELL DRAC card, without success, using the Ubuntu VPN connection manager. I have these data:

        Protocol: PPTP
        SERVER IP PPTP: 1233.123.123.123
        DRAC IP: 192.168.10.25
        Subnet: 255.255.0.0
        User: myuser
        Pass: mypass

    Where do I have to enter these parameters? I have configured the PPTP connection using the graphical tool in Ubuntu 11.10, but in /var/log/syslog I get these messages:

        Apr 15 11:33:15 shinet NetworkManager[1035]: <info> Starting VPN service 'pptp'...
        Apr 15 11:33:15 shinet NetworkManager[1035]: <info> VPN service 'pptp' started (org.freedesktop.NetworkManager.pptp), PID 18180
        Apr 15 11:33:15 shinet NetworkManager[1035]: <info> VPN service 'pptp' appeared; activating connections
        Apr 15 11:33:15 shinet NetworkManager[1035]: <info> VPN plugin state changed: 3
        Apr 15 11:33:15 shinet NetworkManager[1035]: <info> VPN connection 'Connessione VPN 1' (Connect) reply received.
        Apr 15 11:33:15 shinet pppd[18182]: Plugin /usr/lib/pppd/2.4.5/nm-pptp-pppd-plugin.so loaded.
        Apr 15 11:33:15 shinet pppd[18182]: pppd 2.4.5 started by root, uid 0
        Apr 15 11:33:15 shinet pppd[18182]: Using interface ppp0
        Apr 15 11:33:15 shinet pppd[18182]: Connect: ppp0 <--> /dev/pts/1
        Apr 15 11:33:15 shinet NetworkManager[1035]: SCPlugin-Ifupdown: devices added (path: /sys/devices/virtual/net/ppp0, iface: ppp0)
        Apr 15 11:33:15 shinet NetworkManager[1035]: SCPlugin-Ifupdown: device added (path: /sys/devices/virtual/net/ppp0, iface: ppp0): no ifupdown configuration found.
        Apr 15 11:33:15 shinet pptp[18185]: nm-pptp-service-18180 log[main:pptp.c:314]: The synchronous pptp option is NOT activated
        Apr 15 11:33:46 shinet pppd[18182]: LCP: timeout sending Config-Requests
        Apr 15 11:33:46 shinet pppd[18182]: Connection terminated.
        Apr 15 11:33:46 shinet avahi-daemon[1081]: Withdrawing workstation service for ppp0.
        Apr 15 11:33:46 shinet NetworkManager[1035]: SCPlugin-Ifupdown: devices removed (path: /sys/devices/virtual/net/ppp0, iface: ppp0)
        Apr 15 11:33:46 shinet NetworkManager[1035]: <warn> VPN plugin failed: 1
        Apr 15 11:33:46 shinet pppd[18182]: Modem hangup
        Apr 15 11:33:46 shinet NetworkManager[1035]: <warn> VPN plugin failed: 1
        Apr 15 11:33:51 shinet pppd[18182]: Exit.
        Apr 15 11:33:51 shinet NetworkManager[1035]: <warn> VPN plugin failed: 1
        Apr 15 11:33:51 shinet NetworkManager[1035]: <info> VPN plugin state changed: 6
        Apr 15 11:33:51 shinet NetworkManager[1035]: <info> VPN plugin state change reason: 0
        Apr 15 11:33:51 shinet NetworkManager[1035]: <warn> error disconnecting VPN: Could not process the request because no VPN connection was active.
        Apr 15 11:33:51 shinet NetworkManager[1035]: <info> Policy set 'Wired connection 1' (eth0) as default for IPv4 routing and DNS.
        Apr 15 11:33:57 shinet NetworkManager[1035]: <info> VPN service 'pptp' disappeared

    Thanks
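
    For what it's worth: in the NetworkManager PPTP dialog, the PPTP server IP belongs in the Gateway field, the user/pass go in their respective fields, and the DRAC IP (192.168.10.25) is what you browse to once the tunnel is up. Also, the "LCP: timeout sending Config-Requests" line typically means the TCP control channel came up but the GRE traffic (IP protocol 47) is being dropped somewhere en route, so that is worth verifying before blaming the configuration. A command-line sketch using the pptp-linux package ("drac" is a placeholder tunnel name):

        # While a connection attempt is running, check whether any GRE
        # packets come back from the server:
        sudo tcpdump -n -i eth0 proto gre

        # GUI-free alternative using pptp-linux's helper:
        sudo pptpsetup --create drac --server 123.123.123.123 \
            --username myuser --password mypass --encrypt --start

    If tcpdump shows outbound GRE but nothing returning, a firewall or NAT device between you and the DRAC is eating protocol 47.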


  • Gigabyte GA-Z77X-D3H MB problems

    - by Hans
    I installed a new system last week and I have some issues with it. The system consists of:

    - Gigabyte GA-Z77X-D3H with F9 BIOS (latest)
    - Intel Core i5 3570K processor
    - Sapphire Radeon HD7850
    - 2x 8GB Corsair 1600MHz memory
    - OCZ Vertex 2 120G SSD

    Connected peripherals:

    - 2 Samsung 940BF monitors (1 via DVI on the graphics card, 1 via a DisplayPort-to-DVI adapter)
    - 1 Dell U2312HM monitor (DisplayPort)
    - Dell USB hub (monitor)
    - Wired mouse, wireless keyboard (Logitech)
    - Logitech G25 wheel
    - Canon MP800 printer

    Okay, my issues are the following. If I plug in one or more monitors via DisplayPort during boot, most of the time it won't boot properly. I get an empty UEFI message screen: only the header GIGABYTE DUAL BIOS appears. The system reboots itself, turns on for a few seconds (no video) and then reboots again, and then it starts all over again. This repeats until I remove all DisplayPort monitors. Windows then boots, and I can use the monitors when I replug them. The graphics card had been running fine for a few weeks on an older system (Intel Q6600).

    Another issue: if I plug in my G25 steering wheel, the UEFI BIOS is inaccessible. It either gives the same empty UEFI screen, or the BIOS screen renders but crashes halfway (pieces of text and graphics are missing, and it hangs). If I remove the G25, all is fine. To verify that the graphics card is OK and the motherboard is causing these issues, I tried an NVIDIA 8800GT graphics card. It hasn't got DisplayPort, but it also cannot boot into the BIOS with the G25 wheel plugged in.

    The PC also refuses to go into or come out of standby. It just hangs when going into standby and, on other occasions (when it does successfully do so), hangs coming out of standby. The power supply is an OCZ StealthXStream 600W. The processor runs 25-30C idle, ~55C stressed (Scythe Mugen 2).

    I am really puzzled about what can be done to resolve this. I am not keen on an RMA request (otherwise I would return the MB for another type), because it will likely mean waiting a very long time for a replacement. Does anyone else have a similar experience with this board/chipset, or can anyone help me troubleshoot this?


  • Reliable access to Internet but not local network (not DNS or proxy issues)

    - by Ian Goldby
    I'm looking for help with a Vista Home Premium laptop that has trouble accessing any resource on our home network, but accesses the Internet just fine. The set-up is this: the Vista laptop and a MacBook Pro connect wirelessly to the router-modem, and a Synology DS212j NAS drive has a wired connection to the router-modem. Devices on the local network are always referred to by IP address, so this cannot be a DNS issue.

    The MacBook Pro connects reliably to the NAS via AFP (network shared folders), SMB (network shared folders) and HTTP. The Vista laptop connects to and browses sites on the Internet without any problems. It can log into the NAS via SMB and list the shared folders (so there is nothing wrong with the log-in credentials), but when it tries to open any of the folders, Explorer just hangs with the spinning cursor for several minutes and then says "\\192.168.1.64\shared\Photos is not accessible. You might not have permission to use this network resource. Contact the administrator of this server to find out if you have access permissions. The specified network name is no longer available." It can ping the NAS successfully.

    If I try to open the NAS drive's web interface, the browser just hangs. This is the same with IE, Firefox and Chrome. (There is no proxy.) I can log into the NAS drive with FTP and navigate directories, but when I try to list the contents of a directory with more than a handful of entries, the FTP client hangs.

    I set up a website on the MacBook. The Vista laptop was able to load some of the pages, but loading any of the images was very hit and miss. Images embedded in HTML pages never worked no matter how many times I reloaded the page, but when I linked directly to an image it did load (though several attempts were sometimes needed).

    I tried all of this with the Windows Firewall turned off, and with AVG turned off. That made no difference. I'd really appreciate any suggestions anyone can make. The fact that the Vista laptop has trouble with HTTP and FTP as well as SMB connections suggests to me that this is a problem at the TCP level or below. But don't forget it accesses sites outside the LAN with no problems.
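
    One Vista-specific workaround that fits these symptoms (connections that authenticate but hang when transferring data, across SMB, HTTP and FTP alike) is disabling TCP auto-tuning, which was known to misbehave with some routers and NAS devices. This is an educated guess rather than a certain fix:

        :: From an elevated command prompt, then reboot and re-test:
        netsh interface tcp set global autotuninglevel=disabled

        :: To restore the default behavior later:
        netsh interface tcp set global autotuninglevel=normal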


  • Wrapping a point-to-point link

    - by user3712955
    I'm using a pair of IP radios (non-WiFi) to bridge my office engineering LAN (172.0.0.0/8) to a lab in another building. The radios work fine, but they expose a web management interface I'd like to hide, and they also generate traffic (ARP, STP, and more) that I need to keep off my (very, very clean) LAN segments.

    I have some ARM-Linux boards (similar to Beagle/Panda/RasPi) running Ubuntu, and I've put one at each end of the link, between the radio and the LAN. Each of the boards has 2 wired Ethernet interfaces, eth0 and eth1. The LAN segments are connected to eth0, and the radios are connected to eth1. I'd like to accomplish the following (see the sketch after this post):

    1. Keep radio-originated traffic off my LAN segments!
    2. Hide all services provided by the radio (web, ssh, etc.).
    3. Transparently pass all traffic between the LAN segments (including things like ARP).
    4. The above also applies to the ARM-Linux boards: no stray traffic on my LAN from them either!

    I'd like the system to look like a switch: LAN packets arriving at one eth0 appear at the other. And neither eth0 should have an IP address: the working system should behave like a CAT6 cable with some latency (instead of ARM boards and radios).

    Unfortunately, I'm confused about how to properly configure the ARM Ubuntu systems. What I'm guessing I need is a bridge on each board (br0?) and a VLAN (vlan0 or eth0.0?) to isolate the LAN traffic from everything else as it passes through the ARM boards and the radios. Then I need some kind of firewall to block sending anything out eth0 that isn't from the other eth0 (via the VLAN). I've looked at the ip and ebtables commands (especially -t broute). While the concepts sorta-kinda make sense, I'm completely lost in the details.

    Edit: In the perverse case that a system on one of my LAN segments has the same IP address as one of the radios, or as eth1 on the ARM-Ubuntu boards, a VLAN won't work. Which I believe means I need to tunnel all traffic between the two eth0 interfaces to get that "like a wire" behavior. Help?

    Finally, I'd like to have a way to temporarily expose services on the ARM boards (ssh) and the radios (web) for maintenance purposes. Ideally, it would expose an IP address with ssh available on port 22. Once connected, I figure I'd start an X11 session and run a browser on the ARM board to access the radios. Or something. I could log in via the console to enable/disable this, or perhaps use a GPIO to trigger a script. I feel I've identified most of the pieces needed to make all this happen, but I have no idea how to combine them to make a working system. Thanks!
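
    For what it's worth, here is a minimal sketch of the bridge-plus-ebtables part of such a setup, assuming the bridge-utils and ebtables packages are installed and using 00:11:22:33:44:55 as a placeholder for the radio's MAC. It does not address the duplicate-IP corner case from the edit above (which would indeed need tunnelling), and it is a starting point, not a vetted configuration:

        # Bridge the two ports; note that br0 is given no IP address at all:
        sudo brctl addbr br0
        sudo brctl addif br0 eth0
        sudo brctl addif br0 eth1
        sudo ip link set eth0 up
        sudo ip link set eth1 up
        sudo ip link set br0 up

        # Never let the ARM board itself transmit out the LAN-facing port:
        sudo ebtables -A OUTPUT -o eth0 -j DROP

        # Drop bridged frames whose source is the radio's own MAC, so its
        # ARP/STP/management chatter never reaches the LAN:
        sudo ebtables -A FORWARD -i eth1 -s 00:11:22:33:44:55 -j DROP

    For temporary maintenance access, an address could be added to br0 (or the rules flushed) from the console or a GPIO-triggered script, then removed again afterwards.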


  • SQLAuthority News – Speaking Sessions at TechEd India – 3 Sessions – 1 Panel Discussion

    - by pinaldave
    Microsoft Tech-Ed India 2010 is considered the major technology event of the year for various IT professionals and developers. This event will feature a comprehensive forum in which to learn, connect, explore, and evolve the current technologies we have today. I would recommend this event to you, since here you will learn about today's cutting-edge trends, thereby enhancing your work profile and getting ahead of the rest. But the most important benefit of all might be the networking opportunity that you can gain by attending the forum. You can build personal connections with various Microsoft experts and peers that will last even far beyond this event! It also feels good to let you know that I will be speaking at this year's event! So, here are the sessions that await you in this mega-forum.

    Session 1: True Lies of SQL Server – SQL Myth Buster
    Date: April 12, 2010. Time: 11:15pm – 11:45pm
    In this 30-minute demo session, I am going to briefly demonstrate a few SQL Server myths and their resolutions, backed up with demos. This session is a must-attend for all developers and administrators who come to the event. It is going to be a very quick yet fun session.

    Session 2: Master Data Services in Microsoft SQL Server 2008 R2
    Date: April 12, 2010. Time: 2:30pm – 3:30pm
    SQL Server Master Data Services will ship with SQL Server 2008 R2 and will improve Microsoft's platform appeal. This session provides an in-depth demonstration of MDS features and highlights important usage scenarios. Master Data Services enables consistent decision making by allowing you to create, manage and propagate changes from a single master view of your business entities. Also, the Master Data hub, MDS's vital component, helps ensure reporting consistency across systems and delivers faster, more accurate results across the enterprise. We will talk about establishing the basis for a centralized approach to defining, deploying, and managing master data in the enterprise.

    Session 3: Developing with SQL Server Spatial and Deep Dive into Spatial Indexing
    Date: April 14, 2010. Time: 5:00pm – 6:00pm
    Microsoft SQL Server 2008 delivers new spatial data types that enable you to consume, use, and extend location-based data through spatial-enabled applications. Attend this session to learn how to use the spatial functionality in the next version of SQL Server to build and optimize spatial queries. This session outlines how to use the new geography data type to store geodetic spatial data and perform operations on it; use the new geometry data type to store planar spatial data and perform operations on it; take advantage of the new spatial indexes for high-performance queries; use the new spatial results tab to quickly and easily view spatial query results directly from within Management Studio; extend spatial data capabilities by building or integrating location-enabled applications through support for spatial standards and specifications; and much more.

    Panel Discussion: Harness the Power of Web – SEO and Technical Blogging
    Date: April 12, 2010. Time: 5:00pm – 6:00pm
    Here you will learn lots of tricks and tips about SEO and technical blogging from various industry technical-blogging experts.

    This event will surely be one of the most important tech conventions of 2010. TechEd is going to be a very busy time for tech developers and enthusiasts, since every evening there will be a fun session to attend. If any of the above topics interests you, I suggest you attend that session, as you will learn so many things about the topic discussed.

    Reference: Pinal Dave (http://blog.SQLAuthority.com)
    Filed under: MVP, Pinal Dave, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority Author Visit, SQLAuthority News, T SQL, Technology
    Tagged: TechEd, TechEdIn


  • C# 4.0: Dynamic Programming

    - by Paulo Morgado
    The major feature of C# 4.0 is dynamic programming. Not just dynamic typing, but dynamic in a broader sense, which means talking to anything that is not statically typed to be a .NET object.

    Dynamic Language Runtime

    The Dynamic Language Runtime (DLR) is a piece of technology that unifies dynamic programming on the .NET platform, the same way the Common Language Runtime (CLR) has been a common platform for statically typed languages. The CLR always had dynamic capabilities. You could always use reflection, but its main goal was never to be a dynamic programming environment, and there were some features missing. The DLR is built on top of the CLR and adds those missing features to the .NET platform. The Dynamic Language Runtime is the core infrastructure that consists of:

    - Expression Trees: the same expression trees used in LINQ, now improved to support statements.
    - Dynamic Dispatch: dispatches invocations to the appropriate binder.
    - Call Site Caching: for improved efficiency.

    Dynamic languages and languages with dynamic capabilities are built on top of the DLR. IronPython and IronRuby were already built on top of the DLR, and now support for using the DLR is being added to C# and Visual Basic. Other languages built on top of the CLR are expected to also use the DLR in the future. Underneath the DLR there are binders that talk to a variety of different technologies:

    - .NET Binder: allows talking to .NET objects.
    - JavaScript Binder: allows talking to JavaScript in Silverlight.
    - IronPython Binder: allows talking to IronPython.
    - IronRuby Binder: allows talking to IronRuby.
    - COM Binder: allows talking to COM.

    With all these binders it is possible to have a single programming experience for talking to all these environments that are not statically typed .NET objects.

    The dynamic Static Type

    Let's take this traditional statically typed code:

        Calculator calculator = GetCalculator();
        int sum = calculator.Add(10, 20);

    Because the variable that receives the return value of the GetCalculator method is statically typed to be of type Calculator and, because the Calculator type has an Add method that receives two integers and returns an integer, it is possible to call that Add method and assign its return value to a variable statically typed as integer. Now let's suppose the calculator was not a statically typed .NET class but, instead, a COM object or some .NET code we don't know the type of. All of a sudden it gets very painful to call the Add method:

        object calculator = GetCalculator();
        Type calculatorType = calculator.GetType();
        object res = calculatorType.InvokeMember("Add", BindingFlags.InvokeMethod, null, calculator, new object[] { 10, 20 });
        int sum = Convert.ToInt32(res);

    And what if the calculator was a JavaScript object?

        ScriptObject calculator = GetCalculator();
        object res = calculator.Invoke("Add", 10, 20);
        int sum = Convert.ToInt32(res);

    For each dynamic domain we have a different programming experience, and that makes it very hard to unify the code. With C# 4.0 it becomes possible to write code this way:

        dynamic calculator = GetCalculator();
        int sum = calculator.Add(10, 20);

    You simply declare a variable whose static type is dynamic. dynamic is a pseudo-keyword (like var) that indicates to the compiler that operations on the calculator object will be done dynamically. The way you should look at dynamic is that it's just like object (System.Object) with dynamic semantics associated. Anything can be assigned to a dynamic:

        dynamic x = 1;
        dynamic y = "Hello";
        dynamic z = new List<int> { 1, 2, 3 };

    At run-time, every object has a type. In the above example, x is of type System.Int32. When one or more operands in an operation are typed dynamic, member selection is deferred to run-time instead of compile-time. Then the run-time type is substituted in all variables and normal overload resolution is done, just like it would happen at compile-time. The result of any dynamic operation is always dynamic and, when a dynamic object is assigned to something else, a dynamic conversion will occur.

        Code                                         Resolution    Method
        double x = 1.75; double y = Math.Abs(x);     compile-time  double Abs(double x)
        dynamic x = 1.75; dynamic y = Math.Abs(x);   run-time      double Abs(double x)
        dynamic x = 2; dynamic y = Math.Abs(x);      run-time      int Abs(int x)

    The above code will always be strongly typed. The difference is that, in the first case, the method resolution is done at compile-time; in the others, it's done at run-time.

    IDynamicMetaObjectProvider

    The DLR is pre-wired to know .NET objects, COM objects and so forth, but any dynamic language can implement its own objects, and you can implement your own objects in C# through the implementation of the IDynamicMetaObjectProvider interface. When an object implements IDynamicMetaObjectProvider, it can participate in the resolution of how method calls and property accesses are done. The .NET Framework already provides two implementations of IDynamicMetaObjectProvider:

    - DynamicObject (implements IDynamicMetaObjectProvider): The DynamicObject class enables you to define which operations can be performed on dynamic objects and how to perform those operations. For example, you can define what happens when you try to get or set an object property, call a method, or perform standard mathematical operations such as addition and multiplication.
    - ExpandoObject (implements IDynamicMetaObjectProvider): The ExpandoObject class enables you to add and delete members of its instances at run time and also to set and get the values of these members. This class supports dynamic binding, which enables you to use standard syntax like sampleObject.sampleMember instead of more complex syntax like sampleObject.GetAttribute("sampleMember").


  • Oracle Announces Oracle Exadata X3 Database In-Memory Machine

    - by jgelhaus
    Fourth-Generation Exadata X3 Systems are Ideal for High-End OLTP, Large Data Warehouses, and Database Clouds; Eighth-Rack Configuration Offers New Low-Cost Entry Point

    ORACLE OPENWORLD, SAN FRANCISCO – October 1, 2012

    News Facts

    During his opening keynote address at Oracle OpenWorld, Oracle CEO Larry Ellison announced the Oracle Exadata X3 Database In-Memory Machine, the latest generation of its Oracle Exadata Database Machines. The Oracle Exadata X3 Database In-Memory Machine is a key component of the Oracle Cloud. The Oracle Exadata X3-2 Database In-Memory Machine and Oracle Exadata X3-8 Database In-Memory Machine can store up to hundreds of Terabytes of compressed user data in Flash and RAM memory, virtually eliminating the performance overhead of reads and writes to slow disk drives, making Exadata X3 systems the ideal database platforms for the varied and unpredictable workloads of cloud computing. In order to realize the highest performance at the lowest cost, the Oracle Exadata X3 Database In-Memory Machine implements a mass memory hierarchy that automatically moves all active data into Flash and RAM memory, while keeping less active data on low-cost disks. With a new Eighth-Rack configuration, the Oracle Exadata X3-2 Database In-Memory Machine delivers a cost-effective entry point for smaller workloads, testing, development and disaster recovery systems, and is a fully redundant system that can be used with mission-critical applications.

    Next-Generation Technologies Deliver Dramatic Performance Improvements

    Oracle Exadata X3 Database In-Memory Machines use a combination of scale-out servers and storage, InfiniBand networking, smart storage, PCI Flash, smart memory caching, and Hybrid Columnar Compression to deliver extreme performance and availability for all Oracle Database workloads. Oracle Exadata X3 Database In-Memory Machine systems leverage next-generation technologies to deliver significant performance enhancements, including:

    - Four times the Flash memory capacity of the previous generation, with up to 40 percent faster response times and 100 GB/second data scan rates. Combined with Exadata's unique Hybrid Columnar Compression capabilities, hundreds of Terabytes of user data can now be managed entirely within Flash;
    - 20 times more capacity for database writes through updated Exadata Smart Flash Cache software. The new Exadata Smart Flash Cache software also runs on previous-generation Exadata systems, increasing their capacity for writes tenfold;
    - 33 percent more database CPU cores in the Oracle Exadata X3-2 Database In-Memory Machine, using the latest 8-core Intel® Xeon E5-2600 series of processors;
    - Expanded 10Gb Ethernet connectivity to the data center in the Oracle Exadata X3-2, providing 40 10Gb network ports per rack for connecting users and moving data;
    - Up to 30 percent reduction in power and cooling.

    Configured for Your Business, Available Today

    Oracle Exadata X3-2 Database In-Memory Machine systems are available in Full-Rack, Half-Rack, Quarter-Rack, and the new low-cost Eighth-Rack configurations to satisfy the widest range of applications. Oracle Exadata X3-8 Database In-Memory Machine systems are available in a Full-Rack configuration, and both X3 systems enable multi-rack configurations for virtually unlimited scalability. Oracle Exadata X3-2 and X3-8 Database In-Memory Machines are fully compatible with prior Exadata generations, and existing systems can also be upgraded with Oracle Exadata X3-2 servers. Oracle Exadata X3 Database In-Memory Machine systems can be used immediately with any application certified with Oracle Database 11g R2 and Oracle Real Application Clusters, including SAP, Oracle Fusion Applications, Oracle's PeopleSoft, Oracle's Siebel CRM, the Oracle E-Business Suite, and thousands of other applications.

    Supporting Quotes

    "Forward-looking enterprises are moving towards Cloud Computing architectures," said Andrew Mendelsohn, senior vice president, Oracle Database Server Technologies. "Oracle Exadata's unique ability to run any database application on a fully scale-out architecture using a combination of massive memory for extreme performance and low-cost disk for high capacity delivers the ideal solution for Cloud-based database deployments today."

    Supporting Resources

    - Oracle Press Release
    - Oracle Exadata Database Machine
    - Oracle Exadata X3-2 Database In-Memory Machine
    - Oracle Exadata X3-8 Database In-Memory Machine
    - Oracle Database 11g
    - Follow Oracle Database via Blog, Facebook and Twitter
    - Oracle OpenWorld 2012
    - Oracle OpenWorld 2012 Keynotes
    - Like Oracle OpenWorld on Facebook
    - Follow Oracle OpenWorld on Twitter
    - Oracle OpenWorld Blog
    - Oracle OpenWorld on LinkedIn
    - Mark Hurd's keynote with Andy Mendelsohn and Juan Loaiza (watch for the replay to be available soon at http://www.youtube.com/user/Oracle or http://www.oracle.com/openworld/live/on-demand/index.html)


  • Failing to install Ralink RT5592 driver on Ubuntu 14.04 LTS

    - by atisou
    My problem concerns the installation of a Wi-Fi driver (RT5592) for my new Wi-Fi adapter (PCE-N53) on my newly built computer. Basically, I cannot get the driver installed, and therefore I cannot get the Wi-Fi to work. I know I am not the only one who has had this issue this year, between the RT5592 driver and Ubuntu 14.04 LTS, in one way or the other. Has anybody ever been able to fix this problem? Judging from all the posts I have been through, it does not look like it...

    Following an answer to the same problem as mine (I was getting the same "incompatible types" error message as Christopher Kyle Horton), I applied the instructions and made the edits in a script as suggested by Paul B. Unfortunately, I still get an error/warning message (a different one this time) at the end of the make, and the Wi-Fi still does not work. Below is a snapshot of the end of the message:

        In file included from /home/username/Downloads/PCE-N53/Linux/DPO_GPL_RT5592STA_LinuxSTA_v2.6.0.0_20120326/include/os/rt_linux.h:31:0,
                         from /home/username/Downloads/PCE-N53/Linux/DPO_GPL_RT5592STA_LinuxSTA_v2.6.0.0_20120326/include/rtmp_os.h:44,
                         from /home/username/Downloads/PCE-N53/Linux/DPO_GPL_RT5592STA_LinuxSTA_v2.6.0.0_20120326/include/rtmp_comm.h:69,
                         from /home/username/Downloads/PCE-N53/Linux/DPO_GPL_RT5592STA_LinuxSTA_v2.6.0.0_20120326/os/linux/../../os/linux/pci_main_dev.c:31:
        include/linux/module.h:88:32: error: '__mod_pci_device_table' aliased to undefined symbol 'rt2860_pci_tbl'
          extern const struct gtype##_id __mod_##gtype##_table \
                                         ^
        include/linux/module.h:146:3: note: in expansion of macro 'MODULE_GENERIC_TABLE'
          MODULE_GENERIC_TABLE(type##_device,name)
          ^
        /home/username/Downloads/PCE-N53/Linux/DPO_GPL_RT5592STA_LinuxSTA_v2.6.0.0_20120326/os/linux/../../os/linux/pci_main_dev.c:73:1: note: in expansion of macro 'MODULE_DEVICE_TABLE'
          MODULE_DEVICE_TABLE(pci, rt2860_pci_tbl);
          ^
        cc1: some warnings being treated as errors
        make[2]: *** [/home/username/Downloads/PCE-N53/Linux/DPO_GPL_RT5592STA_LinuxSTA_v2.6.0.0_20120326/os/linux/../../os/linux/pci_main_dev.o] Error 1
        make[1]: *** [_module_/home/username/Downloads/PCE-N53/Linux/DPO_GPL_RT5592STA_LinuxSTA_v2.6.0.0_20120326/os/linux] Error 2
        make[1]: Leaving directory `/usr/src/linux-headers-3.13.0-32-generic'
        make: *** [LINUX] Error 2

    The full pastebin data: paste.ubuntu.com/8088834/

    It looks from the message as though one would need to manually edit some other scripts in the driver package, as Paul B suggested in one case. But I have no idea how to do that. Here is the driver package for the Wi-Fi adapter: www.asus.com/uk/Networking/PCEN53/HelpDesk_Download/

    My system is as follows:

    - OS: Ubuntu 14.04 LTS
    - Wi-Fi card: Asus PCE-N53
    - Motherboard: Asus KCMA-D8
    - Processor: AMD Opteron 4228 HE
    - Kernel: 3.13.0-32-generic

    Following this info from chili555 in here, below is some extra info about my system. lspci -nn | grep 0280 gives:

        04:00.0 Network controller [0280]: Ralink corp. RT5592 PCI2 Wireless Network Adapter [1814:5592]

    and sudo apt-get install linux-headers-generic returns:

        linux-headers-generic is already the newest version.
        0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.

    If this is a kernel-version incompatibility between the driver and my kernel (I have 3.13.0-32-generic, and the README file in the driver package says it is compatible with kernel 2.6), how could one work around this? That should be possible, right? The patches proposed on the Ubuntu forums don't work (they lead the computer to freeze).

    Basically: has anybody out there ever been able to make a PCE-N53 work on Ubuntu 14.04 LTS (kernel 3.13)? How should I edit the driver package to make it work with my kernel?
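
    Before fighting the vendor driver any further, it may be worth checking whether the in-kernel rt2800pci driver will handle the card; recent kernels list RT5592 support there, though I cannot confirm how complete it is on 3.13, so treat this as a suggestion rather than a fix:

        # See whether the in-kernel driver binds to the card:
        sudo modprobe rt2800pci
        lspci -k | grep -A 3 5592
        dmesg | tail -n 20

    If "Kernel driver in use" shows rt2800pci, the vendor driver (and its patching) can be skipped entirely.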


  • SQLAuthority News – Author Visit – SQL Server 2008 R2 Launch

    - by pinaldave
    June 11, 2010 was a wonderful day because I attended the very first SQL Server 2008 R2 launch event, held by Microsoft in Mumbai. I traveled to Mumbai from my home town, Ahmedabad. The event was located at one of the best hotels in Mumbai, "The Leela". The SQL Server R2 launch was an evening event with a few interesting talks. SQL PASS is associated with this event as one of the partners, and its goal is to increase the community's awareness of SQL Server. I met many interesting people and had a great networking opportunity at the event.

    The event was kicked off with an awesome laser show and a "Welcome" video, which was followed by a Microsoft executive session with several interesting demos. The very first demo was about PowerPivot. I knew beforehand that there would be PowerPivot demos because it is a very popular subject; however, I was really hoping to see other interesting demos from SQL Server 2008 R2. And believe me, I was happier to see the later demos. There were demos of SQL Server Utility Control Point, as well as an integration of Bing Maps with Reporting Services.

    I really enjoyed the interactive and informative session by Shivaram Venkatesh. He had excellent presentation skills as well as ample technical knowledge to keep the audience attentive. I really liked his presentation style: he did not read the whole slide deck; rather, he picked one point and, using that point, told the story of the whole slide deck. I also enjoyed my conversation with Afaq Choonawala, who is one of the "gem guys" at Microsoft. I also want to acknowledge Ashwin Kini and Mohit Panchal for their excellent support for this event. Mumbai IT Pro is a user group which you can really count on for any kind of help.

    After the excellent demos and a vibrant start to the event, the whole audience was jazzed up. There were two vendor sessions right after the first session. Intel had 15 minutes to present; however, Intel's representative, who had good knowledge of the subject, had nearly 30+ slides in his presentation, so he had to rush a bit to cover the whole slide deck. The Intel presentation was followed by another vendor presentation, from NetApp. I had previously heard about this tool. After I saw the demo, which did not work the first time the NetApp presenter demonstrated it, I started to have doubts about this product. I personally went to the demo booth after the presentation was over to clarify my doubts, but I realized that the NetApp presenter, or booth owner, had absolutely POOR KNOWLEDGE of SQL Server and even of their own NetApp product. The NetApp people tried to misguide us, and when we argued, they started to say things different from what they had said earlier. At one point in their presentation, they claimed their application does something very fast, which did not really happen in front of the whole audience. They blamed SQL Server R2's DBCC CHECKDB command for their product's failed demonstration. I know that NetApp has many great products; however, this one was not presented clearly and even left a negative impression on all of us.

    Well, let us not judge the potential, fun, education and enigma of the launch event through a small glitch. This event was jam-packed and extremely well received by everybody who attended it. As I said, the demos and good presentations by the MS folks were really something to cheer about. Any launch event is considered successful if it achieves its goal of exciting users with its cutting-edge technology, just like this event, which left a very deep impression on me.

    Reference: Pinal Dave (http://blog.SQLAuthority.com)
    Filed under: SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority Author Visit, SQLAuthority News, T SQL, Technology
    Tagged: PASS, SQLPASS


  • Travelling MVP #1: Visit to SharePoint User Group Finland

    - by DigiMortal
    My first self-organized trip this autumn was a visit to the SharePoint User Group Finland community evening. The active community leaders who make things like these possible are worth mentioning; on the spug.fi side it was Jussi Roine who invited me. Here is my short review of my trip to Helsinki.

    User group meeting

    As Helsinki is near Tallinn, I went there by ship. It was easy to get from the sea port to the venue, and I also had a few minutes to visit an academic book store. The community evening was held on the ground floor of a city-center hotel, in a room conveniently located near the hotel bar and restaurant. Here is the meeting schedule:

    - Welcome (Jussi Roine)
    - OpenText application archiving and governance for SharePoint (Bernd Hennicke, OpenText)
    - Using advanced C# features in SharePoint development (Alexey Sadomov, NED Consulting)
    - Optimizing public-facing internet sites for SharePoint (Gunnar Peipman)

    After the meeting, of course, the local guys didn't just walk away but continued with some beers and discussion.

    Sessions

    After welcome words by Jussi, there was a session by Bernd Hennicke, who spoke about OpenText. His session covered OpenText's history and current state. After this introduction he spoke about OpenText's products for SharePoint and gave the audience a good overview of where their SharePoint extensions fit in the big picture. I usually don't like vendor sessions, but this one was good. I mean, the vendor guys were not aggressively selling something. They were quite different: kind people who introduced their stuff and later answered questions. They acted like good guests.

    The second speaker was Alexey Sadomov, who works on SharePoint development projects. He introduced some ways to get over certain limitations of SharePoint. I won't go deeply into his session here, but it is worth mentioning that it was a strong one. It is not a rare case that developers have to make nasty hacks to SharePoint. I mean really nasty hacks. Often these hacks are long blocks of code that use terrible techniques to achieve the result. Alexey introduced some very civilized ways to apply hacks.

    Alexey Sadomov, SharePoint MVP, speaking about SharePoint coding tips and tricks in C#

    I spoke about how I optimized the caching of the Estonian Microsoft community portal, which runs on SharePoint Server and uses the publishing infrastructure. I made no actual demos on SharePoint because I wanted to focus on the optimization process and share some experiences about how to get caches optimized and how to measure them.

    Networking

    After the official part there was time to talk and discuss with people. Finns are cool: they have beers and they are glad. It was not a big community event, but the people were like one good family. The developers there often work for big companies, and it was very interesting for me to hear about their experiences with SharePoint. One thing was a little surprising to me: the SharePoint guys in Finland also talk actively about Office 365 and online SharePoint. That doesn't happen often here in Estonia.

    I had to leave a little before 21:00 to get to my ship back to Tallinn. I am sure the spug.fi guys continued the nice evening and had at least as good a time as I did. Do I want to go back to Finland and meet these guys again? Yes, sure, let's do it again! :)


  • Best WordPress Shopping Cart & Ecommerce Plugins

    - by Edward
    A versatile WordPress shopping cart plugin can help you create a feature-rich online store on your WordPress-powered website or blog. Some are so advanced that you can get your store up and running in minutes. Some plugins allow you to take ecommerce to the next level with their high-end customization tools. Here is a list of the best WP shopping cart plugins available:

    Cart66
    One of the best WordPress plugins, with lots of features, great quality and ease of use. It accepts several payment gateways, such as PayPal Website Payments Standard, PayPal Website Payments Professional, PayPal Express Checkout, eProcessing Network, etc. It has flexible design options; recurring payments for subscriptions, memberships, and payment plans; and easy PCI compliance (safe and secure). It is fast and efficient, one can sell digital and physical products, and the support is good.
    Price: Standard $49 & Professional $99

    StorePress
    StorePress is a fully coded WordPress theme. It comes with scripts that can change a WordPress blog into a veritable e-commerce virtual store. With this great premium WordPress theme, one can start affiliate stores or promote affiliate products.
    Price: Single $59.99 & Developer License $119.99

    WordPress eStore Plugin
    This shopping cart plugin comes with easy checkout, ease of design and use, automatic instant digital product delivery, NextGEN gallery integration, autoresponder integration, etc. It is a lightweight shopping cart and allows a multi-site license. This plugin offers an amazingly comprehensive toolkit that will ensure your online shop is almost just plug-and-play.
    Price: $49.99

    Shoppers Press
    Shoppers Press is a premium cart for WordPress that comes with 20+ options to choose from and 20+ built-in payment gateways. It features one-click setups, personalized user accounts, easy management tools, detailed sales tracking, promotional options, a variety of product import tools, and many more features.
    Price: $79

    WordPress Shopping Cart plugin
    The WordPress Shopping Cart plugin by Tribulant quickly and seamlessly integrates an online shop, with a fully functional shopping cart interface, into any WordPress website. It has an easy-to-use interface, which enables setting up multiple products and categorizing and organizing them into multiple product categories. It also has many more attractive features.
    Price: $49.99

    WP e-Commerce
    WP e-Commerce is a free, full-featured shopping cart plugin for WordPress with easy checkout. It offers a wide range of features, including SSL compatibility, customization and merchandising, integrated payment-processing solutions (including manual payment, Google Checkout and PayPal Payments), and email marketing. It is WordPress- and social-networking-integrated, and it is customizable through PHP template tags, WordPress shortcodes and widgets.

    YAK for WordPress
    YAK is an open-source shopping cart plugin for WordPress. It associates products with weblog entries (in other words, posts), so the post ID also becomes the product code. It supports both pages and posts as products, and handles different types of products through categories. YAK supports downloadable products, so any e-books, plugins, or zip files you're marketing can be easily purchased and downloaded.

    MarketPress
    Another shopping cart full of features. It lets you assign categories and tags to products to make them easy to find, and offers stock tracking with alerts, order management/alerts, fully customizable email messages, full support for most major currencies, fully customizable store URLs/slugs, checkout without being a site user, and more. Expensive, but a good option for those who can afford it.
    Price: $17.42/month

    Shopp
    An excellent shopping cart plugin for WordPress. This plugin is extremely easy to install and use, and it has a clean interface. The customer support is good, and you can easily customize the look of the cart using its many features.
    Price: $55

    Related posts: 8 PHP Shopping Cart Software for Reliable Ecommerce Solution; Shopping Cart SEO; 8 Free Open Source Shopping Carts


  • So No TECH job so far.

    - by Ratman21
    Oh, I found some temp work for the US Census, and I have managed to keep the house (so far), but it looks like I/we are going to have to do a short sale, and the temp job will be ending soon. On top of that, it looks like my unemployment fund is drying up. I will have about one month left after the Census job is done. I am now down to applying for work at the KFC.

    This is the type of work I started with, before I was a tech geek, and really I didn't think I would be doing this kind of work in my later years, but I have a wife and kid. So I've got to suck it up and do it.

    Oh, and here is my new resume... go ahead, I know you want to tear it up. I really don't care any more.

    Scott L. Newman
    45219 Dutton Way, Callahan, FL 32011
    H: (904) 879-4880  C: (352) 356-0945
    E: [email protected]
    Web: http://beingscottnewman.webs.com/

    OBJECTIVE
    To obtain a network or technical support position.

    KEYWORD SUMMARY
    CompTIA A+, Network+, and Security+ certified. Network operation, technical support, client/vendor relations, networking/administration, Cisco routers/switches, helpdesk, Microsoft Office Suite, website design/development/management, Frame Relay, ISDN, Windows NT/98/XP, Visio, inventory management, CICS, programming, COBOL IV, Assembler, RPG.

    QUALIFICATIONS SUMMARY
    Twenty years' experience in computer operations, technical support, and technical writing. Also two and a half years' experience in internet/intranet operations.

    PROFESSIONAL EXPERIENCE

    October 2009 – Present*: Volunteer website and PC technician (part time), True Faith Christian Fellowship Church, Callahan, FL. Project: create and maintain the church's web site to give it worldwide exposure.

    Aug 2008 – September 2009*: Volunteer church sound and video technician (part time), Thomas Creek Baptist Church, Callahan, FL.

    *Note: these positions were for learning and/or keeping skills updated while looking for a tech job and training for new skills.

    February 2005 to October 2008: Client Server Dev/Analyst I, Fidelity National Information Services, Jacksonville, FL. (FNIS acquired Certegy in 2005, and out of 20 personnel I was one of three kept on.)

    August 2003 to February 2005: Senior NetOps Operator, Certegy, St. Pete, FL. (In August 2003, Certegy terminated its contract with EDS, and out of 40 personnel I was one of six kept on.)
    Projects: creation and update of the listing and placement of all raised-floor equipment at the St. Pete site. The listing was made up of a floor plan of the raised floor and equipment-rack diagrams showing the placement of all devices, drawn in Visio and cross-referenced with an inventory Excel document showing which dept was responsible for each device. Sole creator of the network operation and server operation procedures guide (NetOps Guide).
    Expertise: resolving circuit and/or router issues, or assisting the circuit carrier in resolving them, from the company Network Operation Center (NOC), as well as resolving application problems or assisting application support in their resolution.

    July 1999 to August 2003: Senior NetOps Operator, EDS (Certegy account), St. Pete, FL. Same expertise and ongoing projects as listed above for FNIS/Certegy. (Equifax outsourced the NetOps dept. to EDS in 1999.)

    January 1991 to July 1999: NetOps/Tandem Operator, Equifax, St. Pete & Tampa, FL. Same as all of the above for FNIS/Certegy/EDS, except for circuit and router issues.

    EDUCATION
    - New Horizons Computer Learning Center, Jacksonville, Florida: CompTIA A+, Security+, and Network+ certified. Currently working on CCNA certification (07/30/10).
    - Mott Community College, Flint, Michigan: Associate's degree, data processing and general education.
    - Currently studying Japanese.


  • Oracle Database 12c Training and Certification: What’s in it for Me?

    - by KJones
    Oracle Database 12c has officially launched! Through Oracle University, our expert instructors can introduce you to the features and functions of this new Oracle Database 12c product. Through training courses and certification exam prep seminars, you can build up your database knowledge and apply this knowledge to advance your career.

    Already an Oracle Database Expert? Why Oracle Database 12c Training is Still a Good Idea

    Oracle is the industry leader for database technology and takes the release of new products very seriously. We continue to listen to customer needs and add features and functionality to address those needs. Oracle Database 12c is no exception. The following areas have been greatly enhanced and should be considered for your additional training or certification:

    • Database for Cloud Computing
    • Compression and Information Lifecycle Management (ILM)
    • Improved Performance & Scalability
    • Extreme Availability
    • Security Defense in Depth
    • Manageability

    Oracle Certified Database Administrators Reap Career Rewards

    Becoming an expert user of database technology through Oracle University's certification program widens your skill set and demonstrates your expertise implementing the most advanced database technology available. By doing so, you'll make yourself a more marketable employee and candidate in the job market. Reasons to become an Oracle Certified Database Administrator on Oracle Database 12c:

    • The new Oracle Database 12c certifications emphasize more advanced skills that align with industry standards, resulting in an even more valuable credential for customers and partners.
    • The Oracle Certified Associate (OCA) for Oracle Database 12c centers upon certification objectives that measure IT professionals' day-to-day skills, along with your ability to manage challenges.
    • Building upon all of the competencies incorporated into Oracle's Database 12c OCA certification, the Oracle Certified Professional (OCP) for Oracle Database 12c certification includes advanced knowledge and skills required of top-performing database administrators.
    • The Oracle Certified Master (OCM) for Oracle Database 12c, a very challenging and elite top-level certification, certifies the most highly skilled and experienced database experts.
    • Oracle offers 12c upgrade paths for existing Oracle Certified Professionals (OCP) and Oracle Certified Masters (OCM).

    Database 12c Training and Certification: Built with Your Input

    When creating Oracle Database 12c training courses and certifications, we wanted to know which tasks are most important in a DBA's day-to-day work. Instead of assuming what those tasks might be, we developed a job task analysis survey for DBAs. The response rate from DBAs around the world was overwhelming! The survey focused on the following key database areas:

    • DBA Core Essentials
    • Database Storage
    • High Availability
    • Scalability
    • Networking
    • Security
    • Very Large Database Administration
    • Distributed Databases

    After conducting this survey, we took this specific feedback and used it to help mold the new Oracle Database 12c training and certification curriculum. The benefit to you? You now have access to Oracle Database 12c courses and certification exams that were created with your specific on-the-job tasks in mind.

    Explore Oracle Database 12c Training & Certification Today

    Investing in Oracle Database 12c training courses and certifications will help you develop a great deal of knowledge, experience and expertise. Explore our portfolio of offerings to determine which skills you need as a DBA to get up to speed on Oracle Database 12c technology.

    Questions or comments about the new Oracle Database 12c offerings? Let us know in the comments below.

    - Diana Gray, Principal Curriculum Product Manager, and Raza Siddiqui, Senior Principal Curriculum Product Manager

    Read the article

  • Talking JavaOne with Rock Star Simon Ritter

    - by Janice J. Heiss
    Oracle’s Java Technology Evangelist Simon Ritter is well known at JavaOne for his quirky and fun-loving sessions, which this year include:
    CON4644 -- “JavaFX Extreme GUI Makeover” (with Angela Caicedo, on how to improve UIs in JavaFX)
    CON5352 -- “Building JavaFX Interfaces for the Real World” (Kinect gesture tracking and mind reading)
    CON5348 -- “Do You Like Coffee with Your Dessert?” (some cool demos of Java on the Raspberry Pi)
    CON6375 -- “Custom JavaFX Charts” (how to extend JavaFX Chart controls with some interesting things)

    I recently asked Ritter about the significance of the Raspberry Pi, the topic of one of his sessions: a credit card-sized single-board computer developed in the UK with the intention of stimulating the teaching of basic computer science in schools.

    “I don't think there's one definitive thing that makes the RP significant,” observed Ritter, “but a combination of things that really makes it stand out. First, it's the cost: $35 for what is effectively a completely usable computer. OK, so you have to add a power supply, SD card for storage and maybe a screen, keyboard and mouse, but this is still way cheaper than a typical PC. The choice of an ARM processor is also significant, as it avoids problems like cooling (no heat sink or fan) and can use a USB power brick. Combine these two things with the immense groundswell of community support and it provides a fantastic platform for teaching young and old alike about computing, which is the real goal of the project.”

    He informed me that he’ll be at the Raspberry Pi meetup on Saturday (not part of JavaOne). Check out the details here.

    JavaFX Interfaces

    When I asked about how JavaFX can interface with the real world, he said that there are many ways. “JavaFX provides you with a simple set of programming interfaces that can create complex, cool and compelling user interfaces,” explained Ritter. “Because it's just Java code you can combine JavaFX with any other Java library to provide data to display and control the interface. What I've done for my session is look at some of the possible ways of doing this using some of the amazing hardware that's available today at very low cost. The Kinect sensor has added a new dimension to gaming in terms of interaction; there's a Java API to access this so you can easily collect skeleton tracking data from it. Some clever people have also written libraries that can track gestures like swipes, circles, pushes, and so on. We use these to control parts of the UI. I've also experimented with a Neurosky EEG sensor that can in some ways ‘read your mind’ (well, at least measure some of the brain functions like attention and meditation). I've written a Java library for this that I include as a way of controlling the UI. We're not quite at the stage of just thinking a command though!”

    Here Comes Java Embedded

    And what, from Ritter’s perspective, is the most exciting thing happening in the world of Java today? “I think it's seeing just how Java continues to become more and more pervasive,” he said. “One of the areas that is growing rapidly is embedded systems. We've talked about the ‘Internet of things’ for many years; now it's finally becoming a reality. With the ability of more and more devices to include processing, storage and networking, we need an easy way to write code for them that's reliable, has high performance, and is secure. Java fits all these requirements. With Java Embedded being a conference within a conference, I'm very excited about the possibilities of Java in this space.”

    Check out Ritter’s sessions or say hi if you run into him. Originally published on blogs.oracle.com/javaone.
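    Ritter's point that a JavaFX interface is "just Java code" fed by any Java library is easy to show in miniature. Below is a minimal sketch (not from Ritter's sessions; the class name and the random data are illustrative) of a JavaFX chart whose series could just as easily be supplied by a Kinect or EEG wrapper library:

    import javafx.application.Application;
    import javafx.scene.Scene;
    import javafx.scene.chart.LineChart;
    import javafx.scene.chart.NumberAxis;
    import javafx.scene.chart.XYChart;
    import javafx.stage.Stage;

    // Minimal sketch: a JavaFX chart fed by plain Java data. Any Java
    // library (a Kinect wrapper, an EEG sensor API) could supply the
    // numbers the same way.
    public class SensorChartDemo extends Application {
        @Override
        public void start(Stage stage) {
            NumberAxis x = new NumberAxis();
            NumberAxis y = new NumberAxis();
            LineChart<Number, Number> chart = new LineChart<>(x, y);

            XYChart.Series<Number, Number> series = new XYChart.Series<>();
            series.setName("Attention level"); // placeholder for real sensor values
            for (int t = 0; t < 10; t++) {
                series.getData().add(new XYChart.Data<>(t, Math.random() * 100));
            }
            chart.getData().add(series);

            stage.setScene(new Scene(chart, 640, 480));
            stage.setTitle("JavaFX + plain Java data");
            stage.show();
        }

        public static void main(String[] args) {
            launch(args);
        }
    }

    Swapping the random numbers for calls into a gesture-tracking or EEG library is the whole trick: the UI layer never needs to know where the data came from.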

    Read the article

  • Smarter, Faster, Cheaper: The Insurance Industry’s Dream

    - by Jenna Danko
    On June 3rd, I saw the Gaylord Resort Centre in Washington D.C. become the hub for C-level executives and managers of insurance carriers at the IASA 2013 Conference. Insurance accounting/regulation and technology sessions took the focus, but there were plenty of tertiary sessions for career development, which complemented the overall strong networking side of the conference. As an exhibitor, Oracle, along with several hundred other product providers, welcomed the opportunity to display and demonstrate our solutions, and we were encouraged by the hustle and bustle of the exhibition floor. The IASA organizers had pre-arranged fast-track tours whereby interested conference delegates could sign up for a series of like-themed presentations from vendors, giving them a level of 'speed dating' introductions to possible solutions and services. Oracle participated in a number of these, which were very well subscribed.

    Clearly, the conference had a strong business focus; however, attendees saw technology as a key enabler to get their processes done smarter, faster and cheaper. As we navigated through the exhibition, it became clear from the inquiries that came to us that insurance carriers are gravitating to a number of focus areas:

    • Navigating the maze of upcoming regulatory reporting changes. For US carriers with European holdings, Solvency II carries a myriad of rules and reporting requirements. Alignment across the globe of the Own Risk and Solvency Assessment (ORSA) processes brings to the fore the National Association of Insurance Commissioners' (NAIC) recent guidance manual publication.

    • Doing more with less, and certainly expecting more from technology for fewer dollars. The overall cost of IT, in particular hardware, has dropped in real terms (though the appetite for more has risen: more CPU, more RAM, more storage), but software has seen less change. Clearly, customers expect either to pay less or to get a lot more from their software solutions for the same buck.

    • Doing things smarter. There is a recognition that, with the advance of technology, to stand still now means you are technically going backwards. Technology, and in particular technology's interaction with human business processes, has undergone incredible change over the past 5 years. Consumer usage (iPhones, etc.) has been at the forefront, but now ever more effective technology exploitation is beginning to take place at the enterprise level. Data, and in particular gleaning knowledge from data, is refining and improving business processes. Organizations are now consuming more data than ever before, and it is set to grow exponentially for some time to come. Amassing large volumes of data is one thing, but effectively analyzing that data is another. It is the results of such analysis that lead to improvements, both in terms of insurance product offerings and the processes to support them.

    • Regulatory compliance: damned if you do and damned if you don't! Clearly, a lot is changing around the globe from a regulatory perspective, and it is evident that, while there is greater convergence across jurisdictions bringing uniformity, there is also a lot of work to be done in the next 5 years. Just like big data, hidden behind effective regulatory compliance there often lie golden nuggets that can give competitive advantages.

    From Oracle's perspective, our Rating Engine, Billing, Document Management and Insurance Analytics solutions on display served to strike up good conversations and, as is always the case at conferences, it was a great opportunity to meet and speak with existing Oracle customers that we might not have otherwise caught up with for a while.

    Fortunately, I was able to catch a few sessions at the close of the exhibition. The speaker quality was high and the audience asked challenging, but pertinent, questions. During Dr. Jackie Freiberg’s keynote, “Bye Bye Business as Usual,” the author discussed 8 strategies to help leaders create a culture where teams consistently deliver innovative ideas by disrupting the status quo. The very first strategy: get wired for innovation. Freiberg admitted that folks in the insurance and financial services industry understand and know innovation is important, but oftentimes they are slow adopters. Today, technology and innovation go hand in hand.

    In speaking to delegates during and after the conference, a high degree of satisfaction could be measured from their positive comments about the speaker sessions and the exhibitors. I suspect many will be back in 2014, with Indianapolis as the conference location.

    Did you attend the IASA Conference in Washington D.C.? If so, I would love to hear your comments.

    Andrew Collins is the Director, Solvency II, of Oracle Financial Services. He can be reached at andrew.collins AT oracle.com.

    Read the article

  • Your Job Search Should be More Than Just a New Year's Resolution

    - by david.talamelli
    I love the beginning of a new year; it is a great chance to refocus and either re-evaluate the goals you are working toward or set new ones. I don't have any statistics to measure this, but I am sure that one of the more popular New Year's resolutions in the general workforce is to either get a new job or work to further develop one's career. I think this is a good idea: in today's competitive workforce, people should have a plan of what they want to do, what role they are after and how to get there.

    One common mistake many people make, though, is treating a career plan as a once-a-year thought. When people finish the holiday season with their New Year's resolution to find a new job fresh in their mind, you can see the enthusiasm and motivation a person has to make something happen. Emails are sent, calls are made, applications are submitted, networking is happening, and so on. Finding the right role can be difficult, however. While it would be great if that dream role was available just at the time you happened to be looking for it, in reality this is not always the case.

    Job seekers need to keep reminding themselves that, while sometimes the dream job they are after is available at the same time they are looking, a job search can also be a difficult and long process. Many people who set out with the best of intentions in January to find a new job can soon lose interest in the search if they do not immediately find a role. Just as the Christmas decorations are put away and the photos from New Year's are stored away, a job seeker's motivation may slowly decrease until that person finds themselves, 12 months later, in the same situation and same role, looking for that new opportunity again.

    Rather than just "going for it" and looking for a role in the month of January, a person's job search or career plan should be an ongoing activity and thought process that is constantly updated and evaluated over the course of the year. It can be hard to stay motivated over an extended period of time, especially when you are newly motivated and ready for that new role and the results are not immediate. Rather than letting your job search fall down the priority list and into the "too hard basket," a few ideas may keep your enthusiasm fresh:

    • Update your resume every 6 months, even if you are not looking for a job. It is easy to forget what you have accomplished if you don't keep your details updated. It is also good to be prepared and have a resume ready to go in case you do get an unexpected phone call for that 'dream job' you have been hoping for.

    • Work out what you want out of your next role before you begin your job search. Rather than aimlessly searching job ads or talking to people, think of the organisations or type of role you would like before you search. If you know what you are looking for, it will be much easier to work out how to get there than if you do not know what you want.

    • Don't expect immediate results once you decide to look for another job; things don't always fall into place. Timing and delivery can be important pieces of being selected for a role, and companies don't hire every role in January.

    • Have an open mind. People you meet or talk to may not produce immediate results for your job search, but every connection may help you get a bit closer to what you are after.

    These actions will not guarantee a positive result, but in today's competitive workforce every bit of extra preparation and planning helps. All the best for 2011, and I hope your career plan, whatever it may be, is a success.

    Read the article

  • OSI Model

    - by kaleidoscope
    The Open System Interconnection Reference Model (OSI Reference Model or OSI Model) is an abstract description for layered communications and computer network protocol design. In its most basic form, it divides network architecture into seven layers which, from top to bottom, are the Application, Presentation, Session, Transport, Network, Data Link, and Physical Layers. It is therefore often referred to as the OSI Seven Layer Model. A layer is a collection of conceptually similar functions that provide services to the layer above it and receive services from the layer below it.

    Description of the OSI layers:

    Layer 1: Physical Layer
    • Defines the electrical and physical specifications for devices. In particular, it defines the relationship between a device and a physical medium.
    • Establishes and terminates a connection to a communications medium.
    • Participates in the process whereby the communication resources are effectively shared among multiple users.
    • Handles modulation, the conversion between the representation of digital data in user equipment and the corresponding signals transmitted over a communications channel.

    Layer 2: Data Link Layer
    • Provides the functional and procedural means to transfer data between network entities.
    • Detects and possibly corrects errors that may occur in the Physical Layer. The error check is performed using the Frame Check Sequence (FCS).
    • The destination address is examined to determine whether the host needs to process the rest of the frame itself or pass it on to another host.
    • The layer is divided into two sublayers: the Media Access Control (MAC) layer and the Logical Link Control (LLC) layer.
    • The MAC sublayer controls how a computer on the network gains access to the data and permission to transmit it.
    • The LLC sublayer controls frame synchronization, flow control and error checking.

    Layer 3: Network Layer
    • Provides the functional and procedural means of transferring variable-length data sequences from a source to a destination via one or more networks.
    • Performs network routing functions, and might also perform fragmentation and reassembly, and report delivery errors.
    • Routers operate at this layer, sending data throughout the extended network and making the Internet possible.

    Layer 4: Transport Layer
    • Provides transparent transfer of data between end users, providing reliable data transfer services to the upper layers.
    • Controls the reliability of a given link through flow control, segmentation/de-segmentation, and error control.
    • The Transport Layer can keep track of the segments and retransmit those that fail.

    Layer 5: Session Layer
    • Controls the dialogues (connections) between computers.
    • Establishes, manages and terminates the connections between the local and remote application.
    • Provides for full-duplex, half-duplex, or simplex operation, and establishes checkpointing, adjournment, termination, and restart procedures.
    • Implemented explicitly in application environments that use remote procedure calls.

    Layer 6: Presentation Layer
    • Establishes a context between Application Layer entities, in which the higher-layer entities can use different syntax and semantics, as long as the presentation service understands both and the mapping between them. The presentation service data units are then encapsulated into Session Protocol data units and moved down the stack.
    • Provides independence from differences in data representation (e.g., encryption) by translating from application to network format, and vice versa. The Presentation Layer works to transform data into the form that the Application Layer can accept. This layer formats and encrypts data to be sent across a network, providing freedom from compatibility problems. It is sometimes called the syntax layer.

    Layer 7: Application Layer
    • This layer interacts with software applications that implement a communicating component.
    • Identifies communication partners, determines resource availability, and synchronizes communication.
      - When identifying communication partners, the Application Layer determines the identity and availability of communication partners for an application with data to transmit.
      - When determining resource availability, the Application Layer must decide whether sufficient network resources exist for the requested communication.
      - In synchronizing communication, all communication between applications requires cooperation that is managed by the Application Layer.
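    To make the layering concrete, here is a minimal sketch in Java (the host, port, and HTTP request are illustrative, not part of the model itself). Opening a TCP socket engages the Transport Layer; everything the program writes is Application Layer data; the Network, Data Link, and Physical Layers are handled entirely by the operating system and hardware. Note that the real-world TCP/IP stack collapses the Session and Presentation Layers into the application.

    import java.io.OutputStream;
    import java.net.Socket;
    import java.nio.charset.StandardCharsets;

    // Minimal sketch of where application code sits relative to the OSI layers.
    // Host, port, and request contents are placeholders.
    public class OsiLayersDemo {
        public static void main(String[] args) throws Exception {
            // Layers 1-3 (Physical, Data Link, Network) are handled by the NIC,
            // driver, and OS IP stack; the program never touches them directly.
            // Opening a TCP socket engages Layer 4 (Transport): TCP provides the
            // reliable, ordered delivery and flow control described above.
            try (Socket socket = new Socket("example.com", 80)) {
                // Everything written here is Layer 7 (Application) data, in this
                // case an HTTP request. TCP segments it, IP routes it, and the
                // data link layer frames it on the wire.
                String request = "GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n";
                OutputStream out = socket.getOutputStream();
                out.write(request.getBytes(StandardCharsets.US_ASCII));
                out.flush();
                System.out.println("First response byte: " + socket.getInputStream().read());
            }
        }
    }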

    Read the article

  • Clustering for Mere Mortals (Pt2)

    - by Geoff N. Hiten
    Planning. I could stop there and let that be the entirety of post #2 in this series. Planning is the single most important element in building a cluster, and the Laptop Demo Cluster is no exception. One of the more awkward parts of actually creating a cluster is coordinating information between Windows Clustering and SQL Clustering. The dialog boxes show up hours apart, but still have to have matching and consistent information. Excel seems to be a good tool for tracking these settings. My workbook has four pages: Systems, Storage, Network, and Service Accounts.

    The Systems page looks like this:

    Name         | Role                                        | Software                                   | Location
    East         | Physical Cluster Node 1                     | Windows Server 2008 R2 Enterprise          | Laptop VM
    West         | Physical Cluster Node 2                     | Windows Server 2008 R2 Enterprise          | Laptop VM
    North        | Physical Cluster Node 3 (Future Reserved)   | Windows Server 2008 R2 Enterprise          | Laptop VM
    MicroCluster | Cluster Management Interface                | N/A                                        | Laptop VM
    SQL01        | High-Performance High-Security Instance     | SQL Server 2008 Enterprise Edition x64 SP1 | Laptop VM
    SQL02        | High-Performance Standard-Security Instance | SQL Server 2008 Enterprise Edition x64 SP1 | Laptop VM
    SQL03        | Standard-Performance High-Security Instance | SQL Server 2008 Enterprise Edition x64 SP1 | Laptop VM

    Note that everything that has a computer name is listed here, whether physical or virtual.

    Storage looks like this:

    Storage Name | Instance     | Purpose         | Volume      | Path                      | Size (GB) | LUN ID | Speed
    Quorum       | MicroCluster | Cluster Quorum  | Quorum      | Q:                        | 2         |        |
    SQL01Anchor  | SQL01        | Instance Anchor | SQL01Anchor | L:                        | 2         |        |
    SQL02Anchor  | SQL02        | Instance Anchor | SQL02Anchor | M:                        | 2         |        |
    SQL01Data1   | SQL01        | SQL Data        | SQL01Data1  | L:\MountPoints\SQL01Data1 | 2         |        |
    SQL02Data1   | SQL02        | SQL Data        | SQL02Data1  | M:\MountPoints\SQL02Data1 |           |        |

    Starting at the left is the name used in the storage array. It is important to rename resources at each level, whether it is Storage, LUN, Volume, or disk folder. Otherwise, troubleshooting gets complex and difficult. You want to be able to glance at a resource at any level and see where it comes from and what it is connected to.

    Networking is the same way:

    System | Network   | VLAN     | IP                 | Subnet Mask   | Gateway     | DNS1        | DNS2
    East   | Public    | Cluster1 | 10.97.230.x (DHCP) | 255.255.255.0 | 10.97.230.1 | 10.97.230.1 | 10.97.230.1
    East   | Heartbeat | Cluster2 |                    | 255.255.255.0 |             |             |
    West   | Public    | Cluster1 | 10.97.230.x (DHCP) | 255.255.255.0 | 10.97.230.1 | 10.97.230.1 | 10.97.230.1
    West   | Heartbeat | Cluster2 |                    | 255.255.255.0 |             |             |
    North  | Public    | Cluster1 | 10.97.230.x (DHCP) | 255.255.255.0 | 10.97.230.1 | 10.97.230.1 | 10.97.230.1
    North  | Heartbeat | Cluster2 |                    | 255.255.255.0 |             |             |
    SQL01  | Public    | Cluster1 | 10.97.230.x (DHCP) | 255.255.255.0 |             |             |
    SQL02  | Public    | Cluster1 | 10.97.230.x (DHCP) | 255.255.255.0 |             |             |

    One hallmark of a poorly planned and implemented cluster is a bunch of "Local Network Connection #n" entries in the network settings page. That lets me know that somebody didn't care about the long-term supportability of the cluster. This can be critically important with Hyper-V clusters and their high NIC counts.

    Final page:

    Instance       | Service Name | Account       | Password   | Domain  | OU
    SQL01          | SQL Server   | SVCSQL01      | Baseline22 | MicroAD | Service Accounts
    SQL01          | SQL Agent    | SVCSQL01      | Baseline22 | MicroAD | Service Accounts
    SQL02          | SQL Server   | SVC_SQL02     | Baseline22 | MicroAD | Service Accounts
    SQL02          | SQL Agent    | SVC_SQL02     | Baseline22 | MicroAD | Service Accounts
    SQL03 (Future) | SQL Server   | SVC_SQL03     | Baseline22 | MicroAD | Service Accounts
    SQL03 (Future) | SQL Agent    | SVC_SQL03     | Baseline22 | MicroAD | Service Accounts
    Installation   |              | administrator |            |         |

    Yes, I write down the account information. I secure the file via NTFS, but I don't want to fumble around looking for passwords when it comes time to rebuild a node.

    Always fill out the workbook COMPLETELY before installing anything. The whole point is to have everything you need at your fingertips before you begin. The install experience is so much better and more productive with this information in place.
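    Because the value of the workbook comes from keeping its pages consistent with one another, it can also be worth checking them mechanically. The sketch below is hypothetical: it assumes the Systems and Storage pages have been exported to systems.csv and storage.csv with the column order shown above (none of this is part of the workbook itself), and it simply flags any SQL instance that has no storage mapped to it.

    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.util.HashSet;
    import java.util.List;
    import java.util.Set;

    // Hypothetical consistency check over CSV exports of the planning
    // workbook. File names and column order are assumptions.
    public class ClusterPlanCheck {
        public static void main(String[] args) throws Exception {
            // systems.csv: Name,Role,Software,Location (header row first)
            List<String> systems = Files.readAllLines(Paths.get("systems.csv"));
            // storage.csv: StorageName,Instance,Purpose,Volume,Path,SizeGB,LunId,Speed
            List<String> storage = Files.readAllLines(Paths.get("storage.csv"));

            // Collect every instance that owns at least one storage row.
            Set<String> instancesWithStorage = new HashSet<>();
            for (String row : storage.subList(1, storage.size())) {
                instancesWithStorage.add(row.split(",")[1].trim());
            }

            // Flag SQL instances from the Systems page with no storage mapped.
            for (String row : systems.subList(1, systems.size())) {
                String name = row.split(",")[0].trim();
                if (name.startsWith("SQL") && !instancesWithStorage.contains(name)) {
                    System.out.println("WARNING: no storage mapped for instance " + name);
                }
            }
        }
    }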

    Read the article

  • How to Watch NCAA March Madness Online

    - by DigitalGeekery
    You’ve filled out your brackets and now you are ready for one of America’s most popular sporting events. But what if you are stuck at work or away from your TV? Or your local affiliate is showing a different game? Today we show how to catch all the March Madness online.

    March Madness on Demand

    You’ll need a broadband connection and 512 MB of RAM or higher, with cookies and JavaScript enabled in your browser. March Madness on Demand offers two viewing options, a Standard Player and a High Quality player. The High Quality player is not, unfortunately, high definition.

    Standard Player requirements:
    • Windows XP/Vista/7 or Mac OS X
    • IE 6+ (we also successfully tested it in Firefox, Chrome, and Opera)
    • Adobe Flash Player 9 or higher

    High Quality Player requirements:
    • 2.4 GHz Pentium 4 or Intel-based Macintosh
    • Mac OS 10.4.8+ (Intel-based)
    • Windows: XP SP2, Vista, Server 2003, Server 2008, Windows 7
    • Firefox 1.5+ or IE 6/7/8
    • Silverlight 3 browser plug-in

    Watching March Madness on Demand

    Go to the March Madness on Demand website (link below). Check the “Watch in High Quality” section to see if your browser is ready and compatible for the High Quality viewer. If not, you’ll see a message indicating either that your browser and system are incompatible or that you need to install Silverlight. To install Silverlight, click on the “Get HQ” button and follow the prompts to download and install it.

    To launch the player, click the large red “Launch Player” button. At the top of the screen, you’ll see the current and upcoming games. Click on “Watch Now” to begin watching. At the bottom left is where you click to watch with the High Quality player. If too many people are watching the High Quality player, you’ll see a message and have to go back to the Standard Player. At the lower right are volume controls, a “Full Screen” button, and a “Share” button, which allows you to share the game you are watching on social networking sites like Facebook and Twitter.

    Perhaps most importantly for those who want to steal a bit of viewing time while at work is the “Boss Button” at the top right. Clicking the “Boss Button” will open a fake Office document, so at first glance it may appear that you’re actually doing legitimate work. To return to the game, click anywhere on the screen with your mouse.

    You’ll be able to catch every single game of the tournament, from the first round all the way through the championship, with March Madness on Demand. If your computer and Internet connection can handle it, you can even watch multiple games at the same time by opening March Madness on Demand in multiple browser windows.

    Watch March Madness online

    Read the article

  • Oracle at The Forrester Customer Intelligence and Marketing Leadership Forums

    - by Christie Flanagan
    The Forrester Customer Intelligence Forum and the Forrester Marketing Leadership Forums will soon be here. This year’s events will be co-located on April 18-19 at the J.W. Marriott at the L.A. Live entertainment complex in downtown Los Angeles.

    Last year’s Marketing Forum was quite memorable for me. You see, while Forrester analysts and business marketers were busy mingling over at the Marriott, another marketing powerhouse was taking up residence a few feet away at The Staples Center. That’s right, folks: Lady Gaga was coming to town. And, as I came to learn, it made perfect sense for Lady Gaga and her legions of fans to be sharing a small patch of downtown L.A. with marketing leaders from all over the world. After all, whether you like Lady Gaga or not, what pop star in recent memory has done more to build herself into a brand and to create an engaging, social and interactive customer experience for her Little Monsters?

    While Lady Gaga won’t be back in town for this year’s Forrester events, there are still plenty of compelling reasons to make the trip out to Los Angeles. The theme for The Forrester Customer Intelligence and Marketing Leadership Forums this year is “From Cool To Critical: Creating Engagement In The Age Of The Customer,” and the events will tackle the important questions about how marketers can survive and thrive in the age of the empowered customer:

    • How can you assess consumer uptake of new innovations?
    • How do you build deep customer knowledge to drive competitive advantage?
    • How do you drive deep, personalized customer engagement?
    • What is more valuable: eyeballs or engagement?
    • How do business customers engage in new media types?
    • How can you tie social data to corporate data?
    • Who should lead the movement to customer obsession?
    • How should you shift your planning and measurement approaches to accommodate more data and a higher signal-to-noise ratio?
    • What role does technology play in customizing and synchronizing marketing efforts across channels?

    As a platinum sponsor of the event, Oracle offers a number of ways to interact while you’re attending the Forums. Here are some of the highlights:

    Oracle Speaking Session
    Thursday, April 19, 9:15am – 9:55am
    Maximize Customer Engagement and Retention with Integrated Marketing & Loyalty
    Melissa Boxer, Vice President, Oracle CRM Marketing & Loyalty

    Customers expect to interact with your company, brand and products in more ways than ever before. New devices and channels, such as mobile, social and web, are creating radical shifts in the customer buying process and the ways your company can reach and communicate with existing and potential customers. While Marketing's objectives (attract, convert, retain) remain fundamentally the same, your approach and tools must adapt quickly to succeed in this more complex, cross-channel world. Hear how leading brands are using Oracle's integrated marketing and loyalty solutions to maximize customer engagement and retention through better planning, execution, and measurement of synchronized cross-channel marketing initiatives.

    Solution Showcase
    Wednesday, April 18: 10:20am – 11:50am, 12:30pm – 1:30pm, 2:55pm – 3:40pm
    Thursday, April 19: 9:55am – 10:40am, 12:00pm – 1:00pm

    Solution Showcase & Networking Reception
    Wednesday, April 18: 5:10pm – 6:20pm

    Be sure to follow the #webcenter hashtag for updates on these events. And for a more considered perspective on what Lady Gaga can teach businesses about branding and customer experience, check out Denise Lee Yohn’s post, Lessons from Lady Gaga, from the Brand as Business Bites blog.

    Read the article
