Search Results

Search found 55 results on 3 pages for 'arvind bhatt'.

Page 1/3 | 1 2 3  | Next Page >

  • SQL SERVER – Guest Post – Architecting Data Warehouse – Niraj Bhatt

    - by pinaldave
    Niraj Bhatt works as an Enterprise Architect for a Fortune 500 company and has an innate passion for building and studying software systems. He is a top-rated speaker at various technical forums including Tech·Ed, MCT Summit, Developer Summit, and Virtual Tech Days, among others. Having run a successful startup for four years, Niraj enjoys working on IT innovations that can impact an enterprise's bottom line, streamlining IT budgets through IT consolidation, architecture and integration of systems, performance tuning, and review of enterprise applications. He has received the Microsoft MVP award for ASP.NET, Connected Systems, and most recently Windows Azure. When he is away from his laptop, you will find him taking deep dives into automobiles, pottery, rafting, photography, cooking, and financial statements, though not necessarily in that order. He is also a manager/speaker at BDOTNET, Asia's largest .NET user group. Here is the guest post by Niraj Bhatt.

    As data in your applications grows, it's the database that usually becomes the bottleneck. It's hard to scale a relational DB, and the preferred approach for large-scale applications is to create separate databases for writes and reads, referred to as the transactional database and the reporting database. Though there are tools and techniques that allow you to create a snapshot of your transactional database for reporting purposes, sometimes they don't quite fit the reporting requirements of an enterprise. These requirements typically include data analytics, an effective schema (one an information worker can use to self-serve), historical data, better performance (flat data, no joins), etc. This is where the need for a data warehouse or an OLAP system arises. A key point to remember is that a data warehouse is mostly a relational database; it's built on top of the same concepts: tables, rows, columns, primary keys, foreign keys, etc. Before we talk about how data warehouses are typically structured, let's understand the key components that create a data flow between OLTP systems and OLAP systems. There are three major areas:

    a) The OLTP system should be capable of tracking its changes, as all these changes should go back to the data warehouse for historical recording. For example, if an OLTP transaction moves a customer from the silver to the gold category, the OLTP system needs to ensure that this change is tracked and sent to the data warehouse for reporting purposes. A report in this context could be how many customers, divided by geography, moved from the silver to the gold category. In data warehouse terminology this process is called Change Data Capture. Quite a few systems leverage database triggers to move these changes into corresponding tracking tables, and some databases provide out-of-the-box features; for example, SQL Server 2008 offers Change Data Capture and Change Tracking for addressing such requirements.

    b) After we make the OLTP system capable of tracking its changes, we need to provision a batch process that runs periodically, takes these changes from the OLTP system, and loads them into the data warehouse. There are many tools out there that can help you fill this gap; SQL Server Integration Services happens to be one of them.

    c) So we have an OLTP system that knows how to track its changes, and we have jobs that run periodically to move these changes to the warehouse. The question that remains is how the warehouse will record these changes. This structural change in the data warehouse arena is often covered under something called a Slowly Changing Dimension (SCD). While we will talk about dimensions in a while, SCD can be applied to pure relational tables too. SCD enables a database structure to capture historical data. This creates multiple records for a given entity in the relational database, which is why data warehouses prefer having their own primary key, often known as a surrogate key.

    As I mentioned, a data warehouse is just a relational database, but the industry often attributes a specific schema style to data warehouses. These styles are the Star Schema and the Snowflake Schema. The motivation behind these styles is to create a flat database structure (as opposed to a normalized one), which is easy to understand and use, easy to query, and easy to slice and dice. A star schema is a database structure made up of dimensions and facts. Facts are generally the numbers (sales, quantity, etc.) that you want to slice and dice. Fact tables hold these numbers and have references (foreign keys) to a set of tables that provide context around those facts. For example, if you have recorded 10,000 USD in sales, that number would go in a sales fact table and could have foreign keys attached to it that refer to the sales agent responsible for the sale and to a time table containing the dates between which that sale was made. These agent and time tables are called dimensions, and they provide context to the numbers stored in fact tables. This schema structure, with the fact at the center surrounded by dimensions, is called a star schema. A similar structure, with the difference that the dimension tables are normalized, is called a snowflake schema.

    This relational structure of facts and dimensions serves as an input for another analysis structure called a cube. Though physically a cube is a special structure supported by commercial databases like SQL Server Analysis Services, logically it's a multidimensional structure where dimensions define the sides of the cube and facts define the content. Facts are often called measures inside a cube. Dimensions often tend to form a hierarchy; for example, Product may be broken into categories, and categories in turn into individual items. Category and Item are often referred to as levels and their constituents as members, with the overall structure called a hierarchy. Measures are rolled up according to the dimensional hierarchy, and these rolled-up measures are called aggregates. Now this may seem like an overwhelming vocabulary to deal with, but don't worry, it will sink in as you start working with cubes.

    Let's look at a few other terms we would run into while talking about data warehouses. ODS, or Operational Data Store, is a frequently misused term. There will be a few users in your organization who want to report on the most current data and can't afford to miss a single transaction for their report. Then there is another set of users who typically don't care how current the data is: mostly senior-level executives who are interested in trending, mining, forecasting, strategizing, etc. and don't care about that one specific transaction. This is where an ODS can come in handy. An ODS can use the same star schema and OLAP cubes we saw earlier; the only difference is that the data inside an ODS is short-lived, i.e. kept for a few months, and the ODS syncs with the OLTP system every few minutes. The data warehouse can periodically sync with the ODS, either daily or weekly, depending on business drivers.

    Data marts are another frequently talked-about topic in data warehousing. They are subject-specific data warehouses. Data warehouses that try to span an entire enterprise are normally too big to scope, build, manage, track, etc. Hence they are often scaled down to something called a data mart that supports a specific segment of business like sales, marketing, or support. Data marts, too, are often designed using the star schema model discussed earlier. The industry is divided when it comes to the use of data marts. Some experts prefer having data marts along with a central data warehouse; the data warehouse here acts as an information staging and distribution hub, with the spokes being data marts connected via data feeds serving summarized data. Others eliminate the need for a centralized data warehouse, citing that most users want to report on detailed data.

    Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Best Practices, Business Intelligence, Data Warehousing, Database, Pinal Dave, PostADay, Readers Contribution, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology
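    To make the star schema description above concrete, here is a small, illustrative Java/JDBC sketch that creates one fact table and the two dimension tables from the sales example. The connection URL, table names, and column names are assumptions for illustration only, not taken from the post; a real warehouse would be modeled and loaded with proper ETL tooling.

        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.Statement;

        public class StarSchemaSketch {
            public static void main(String[] args) throws Exception {
                // Placeholder connection URL; any relational database would do, since a
                // star schema is still just tables, keys, and foreign keys.
                try (Connection con = DriverManager.getConnection(
                             "jdbc:sqlserver://localhost;databaseName=demo_dw");
                     Statement st = con.createStatement()) {

                    // Dimension tables give context to the numbers; each carries a surrogate key.
                    st.executeUpdate("CREATE TABLE dim_agent (agent_key INT PRIMARY KEY, agent_name VARCHAR(100))");
                    st.executeUpdate("CREATE TABLE dim_date (date_key INT PRIMARY KEY, calendar_date DATE)");

                    // The fact table holds the measure (a sales amount) plus foreign keys
                    // pointing outward to the dimensions, forming the 'star' shape described above.
                    st.executeUpdate("CREATE TABLE fact_sales ("
                            + " sales_key INT PRIMARY KEY,"
                            + " agent_key INT REFERENCES dim_agent(agent_key),"
                            + " date_key INT REFERENCES dim_date(date_key),"
                            + " sales_amount DECIMAL(12,2))");

                    // A 10,000 USD sale recorded against agent 1 on 2024-01-01.
                    st.executeUpdate("INSERT INTO dim_agent VALUES (1, 'A. Sales Agent')");
                    st.executeUpdate("INSERT INTO dim_date VALUES (20240101, '2024-01-01')");
                    st.executeUpdate("INSERT INTO fact_sales VALUES (1, 1, 20240101, 10000.00)");
                }
            }
        }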

    Read the article

  • Integrating with Fusion Applications using SOAP web services and REST APIs (Part 1 of 2) by Arvind Srinivasamoorthy

    - by JuergenKress
    Fusion Applications provides several types of interfaces to facilitate integration with other applications within the enterprise and on the cloud. As one of the key integration interfaces, Fusion Applications (FA) supports SOAP-services-based integration, both inbound and outbound. At this point FA doesn't provide REST APIs, but they are planned for a future release. It is, however, possible to invoke external REST APIs from FA, which we will discuss. Oracle continues to invest in improving both SOAP- and REST-based connectivity. The content in this blog is based on features that were available at the time of writing. In this two-part blog, I will briefly cover the following topics: invoking FA SOAP web services from external applications; identifying the FA SOAP web service to be invoked; a sample invocation from an external application; techniques to invoke FA services from an ADF application; invoking external SOAP web services from FA (covered in Part 2); and invoking external REST APIs from FA (covered in Part 2). I'll touch upon some basics, so that you can quickly build a few SOAP/REST interactions with FA. If you do not already have access to an FA instance (on-premise or SaaS), you can request a free 30-day trial of the Oracle Sales Cloud using http://cloud.oracle.com

    1. Invoking FA SOAP web services from external applications. There are two main types of services that FA exposes. ADF services: these services allow you to perform CRUD operations on Fusion business objects, for example the Sales Party Service, Opportunity Service, etc. Using these services you can typically perform operations such as get, find, create, delete, and update on FA objects. These services are typically useful for UI-driven integrations, such as looking up FA information from external application UIs or using third-party interfaces to create/update data in FA. They are also used in non-UI-driven integration use cases, such as the initial upload of business or setup data, synchronizing data with external systems, etc. Composite services: these services involve more logic than CRUD and often involve human workflows, rules, etc. They perform a business function, such as the Get Orchestration Order Service, and are used when building larger process-based integrations with external systems. These services are usually asynchronous in nature and are not typically used for UI integration patterns.

    1a. Identifying the FA SOAP web service to be invoked. All FA web service metadata is available through an OER instance (Oracle Enterprise Repository), which is publicly available via http://fusionappsoer.oracle.com. This is the starting point for you to discover the services that you are going to work with. You do not need to own an FA account to browse the services using the above UI. You can use the search area on the left to narrow down your search to what you are looking for; for example, you can choose the type as ADF Services or Composite, or narrow your search to a specific FA version, product family, etc. Read the complete article here.

    SOA & BPM Partner Community. For regular information on Oracle SOA Suite become a member of the SOA & BPM Partner Community; for registration please visit www.oracle.com/goto/emea/soa (OPN account required). If you need support with your account please contact the Oracle Partner Business Center. Blog Twitter LinkedIn Facebook Wiki Technorati Tags: AppAdvantage,SOA Community,Oracle SOA,Oracle BPM,Community,OPN,Jürgen Kress,Arvind Srinivasamoorthy
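    As a rough illustration of the first topic, here is a minimal Java sketch of calling an FA SOAP service from an external application using the JDK's SAAJ API. The endpoint URL, namespace, operation name, and payload field below are placeholders rather than the real Sales Party Service contract; the actual values have to be taken from the service WSDL discovered in OER as described above.

        import java.nio.charset.StandardCharsets;
        import java.util.Base64;
        import javax.xml.soap.MessageFactory;
        import javax.xml.soap.MimeHeaders;
        import javax.xml.soap.SOAPBody;
        import javax.xml.soap.SOAPConnection;
        import javax.xml.soap.SOAPConnectionFactory;
        import javax.xml.soap.SOAPElement;
        import javax.xml.soap.SOAPEnvelope;
        import javax.xml.soap.SOAPMessage;

        // javax.xml.soap ships with JDK 8; on newer JDKs add a SAAJ implementation dependency.
        public class FaSoapCallSketch {
            public static void main(String[] args) throws Exception {
                // Build the request envelope. Namespace, operation, and element names are
                // hypothetical; take the real ones from the service WSDL in OER.
                MessageFactory messageFactory = MessageFactory.newInstance();
                SOAPMessage request = messageFactory.createMessage();
                SOAPEnvelope envelope = request.getSOAPPart().getEnvelope();
                envelope.addNamespaceDeclaration("typ", "http://example.com/salesParty/types/");
                SOAPBody body = envelope.getBody();
                SOAPElement operation = body.addChildElement("getSalesParty", "typ");
                operation.addChildElement("partyId", "typ").addTextNode("12345");

                // FA services are secured; basic auth over HTTPS is shown here only for brevity.
                MimeHeaders headers = request.getMimeHeaders();
                String credentials = Base64.getEncoder()
                        .encodeToString("user:password".getBytes(StandardCharsets.UTF_8));
                headers.addHeader("Authorization", "Basic " + credentials);
                request.saveChanges();

                // Send the request to a placeholder endpoint and dump the raw response.
                SOAPConnection connection = SOAPConnectionFactory.newInstance().createConnection();
                SOAPMessage response = connection.call(request,
                        "https://fa-host.example.com:443/salesParty/SalesPartyService");
                response.writeTo(System.out);
                connection.close();
            }
        }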

    Read the article

  • minimax depth first search game tree

    - by Arvind
    Hi, I want to build a game tree for the nine men's morris game and apply the minimax algorithm on the tree for node evaluation. Minimax uses DFS to evaluate nodes. So should I build the tree first up to a given depth and then apply minimax, or can building the tree and evaluating it happen together in a recursive minimax DFS? Thank you, Arvind
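    For what it's worth, the second approach is the usual one: minimax already is a depth-first search, so child positions can be generated on the fly inside the recursion and discarded on the way back up, with no need to materialize the whole tree first. A minimal Java sketch follows; GameState and its methods are hypothetical stand-ins for a nine men's morris position, not code from the question.

        import java.util.List;

        interface GameState {
            boolean isTerminal();               // win, loss, or draw reached
            int evaluate();                     // static score from the maximizing player's view
            List<GameState> successors();       // positions reachable in one legal move
        }

        final class Minimax {
            // Depth-limited minimax: the tree is built implicitly by the recursion itself.
            static int minimax(GameState state, int depth, boolean maximizing) {
                if (depth == 0 || state.isTerminal()) {
                    return state.evaluate();                    // evaluate leaves only
                }
                int best = maximizing ? Integer.MIN_VALUE : Integer.MAX_VALUE;
                for (GameState child : state.successors()) {    // children generated lazily here
                    int value = minimax(child, depth - 1, !maximizing);
                    best = maximizing ? Math.max(best, value) : Math.min(best, value);
                }
                return best;
            }
        }

    Pre-building the tree to a fixed depth also works, but it mainly costs memory; the lazy version also combines naturally with alpha-beta pruning later, because pruned subtrees are simply never generated.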

    Read the article

  • Memory leak in xbap application

    - by Arvind
    Hi, we are using many custom controls built by inheriting from the WPF controls as a base and customizing them for our needs. However, the memory used by these controls is not released, even after the pages using the controls are closed, until the whole application is closed. As the application has to run for a whole day, performance decreases as more and more memory gets held up. When we profiled our pages we found that the controls were not getting collected because some binding references, borders, brushes, etc. were not getting cleared from the controls. We tried to use the Unload event of the controls to remove the event handlers and some references from the control. This reduced the leak to some extent, but it slowed down closing of the page, and the Unload event was also getting triggered when the control was merely collapsed. Are there any other ways to overcome the leak? Are there any best practices to prevent memory leaks? Thanks, Arvind

    Read the article

  • Problem in installing Ubuntu 12 using USB

    - by Hitesh Bhatt
    I have a Sony laptop (VPCEH25EN) and I am trying to install Ubuntu 12.04 LTS. I downloaded the ISO named "ubuntu-12.04-desktop-amd64.iso". To install it, I created a bootable USB using UltraISO. When I boot from the USB it hangs and shows "Booting from USB"; I even left it for hours and it didn't move a bit. When I booted the ISO in VirtualBox, it ran well. I also used other tools to make a bootable USB, but then it says "BOOTMGR missing". Please help, I am new to Ubuntu and my optical drive is fried. Thanks in advance.

    Read the article

  • Books and resources for Java Performance tuning - when working with databases, huge lists

    - by Arvind
    Hi all, I am relatively new to working on huge applications in Java. I am working on a Java web service which is pretty heavily used by various clients. The service basically queries the database (Hibernate) and then works with a lot of Lists (there are adapters to convert the lists returned from the DB to the interface which the service publishes), and I am seeing a lot of issues with the service, like high CPU usage or high heap usage. While I can troubleshoot the performance issues using a profiler, I want to learn what I need to take care of when I actually write code, like what kind of List to use, or things like using StringBuilder instead of String, etc. Are there any books or blogs I can refer to that will help me while I write new services? Also, my application is multithreaded (each service call from a client is a new thread), and I want to know some best practices around that area as well. I did search the web, but I found many tips which are not relevant in the latest Java 6 releases, so I wanted to know what kind of resources would help a developer starting out now on Java for heavily used applications. Arvind
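    As a small, hedged illustration of the kind of points such resources cover, here is a Java sketch of two of the habits mentioned in the question: pre-sizing the lists built by adapters, and using StringBuilder instead of String concatenation in loops. The entity and DTO types below are hypothetical placeholders, not from the original service.

        import java.util.ArrayList;
        import java.util.List;

        public final class AdapterSketch {

            // Pre-size the destination list when the source size is known, so the backing
            // array is allocated once instead of being grown and copied repeatedly.
            static List<CustomerDto> toDtos(List<CustomerEntity> entities) {
                List<CustomerDto> dtos = new ArrayList<>(entities.size());
                for (CustomerEntity e : entities) {
                    dtos.add(new CustomerDto(e.id, e.name));
                }
                return dtos;
            }

            // Build keys with StringBuilder rather than String concatenation in a loop,
            // which would otherwise allocate a new intermediate String on every iteration.
            static String cacheKey(List<Long> ids) {
                StringBuilder key = new StringBuilder(ids.size() * 8);
                for (Long id : ids) {
                    key.append(id).append(':');
                }
                return key.toString();
            }

            // Hypothetical stand-ins for the Hibernate entity and the published DTO type.
            static final class CustomerEntity {
                final long id;
                final String name;
                CustomerEntity(long id, String name) { this.id = id; this.name = name; }
            }

            static final class CustomerDto {
                final long id;
                final String name;
                CustomerDto(long id, String name) { this.id = id; this.name = name; }
            }
        }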

    Read the article

  • Useful Tips for BizTalk 2006 to BizTalk 2009 Porting

    - by Arvind Chaudhary
    BizTalk projects require some manual intervention in order to upgrade them. Execute the following steps to port a BizTalk solution / project:
    1. Open the project's solution file (.sln) using a text editor – Notepad++ is recommended.
    2. Remove all the contents (in red below) between, but not including, the following elements:
       GlobalSection(ProjectConfigurationPlatforms) = postSolution
           {5C48CB6B-AE6F-4288-A8EE-46E352BB730C}.Debug|.NET.ActiveCfg = Debug|Any CPU
           {5C48CB6B-AE6F-4288-A8EE-46E352BB730C}.Debug|.NET.Build.0 = Debug|Any CPU
           {5C48CB6B-AE6F-4288-A8EE-46E352BB730C}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
           {5C48CB6B-AE6F-4288-A8EE-46E352BB730C}.Debug|Any CPU.Build.0 = Debug|Any CPU
           …
       EndGlobalSection
       You should see the following once you have removed the contents:
       GlobalSection(ProjectConfigurationPlatforms) = postSolution
       EndGlobalSection
       Note: There should not be any
    3. For each BizTalk project (.btproj) in the solution (.sln), find and replace the following in the .btproj file:
       'Name = "Debug"' with 'Name = "Development"'
       'Name = "Release"' with 'Name = "Deployment"'
       "bin\Debug" with "bin\Development"
       "bin\Release" with "bin\Deployment"
    4. Save the file.

    Read the article

  • Software index broken

    - by Arvind Gangwar
    When I was installing MTS MBlaze's crossplatformui.deb for the MTS data connection, it installed partially and showed an error, so I tried to uninstall "crossplatformui", but every time it showed the following error:
    installArchives() failed:
    perl: warning: Setting locale failed.
    perl: warning: Please check that your locale settings: LANGUAGE = (unset), LC_ALL = (unset), LANG = "en_IN.ISO8859-1" are supported and installed on your system.
    perl: warning: Falling back to the standard locale ("C").
    locale: Cannot set LC_CTYPE to default locale: No such file or directory
    locale: Cannot set LC_MESSAGES to default locale: No such file or directory
    locale: Cannot set LC_ALL to default locale: No such file or directory
    perl: warning: Setting locale failed.
    perl: warning: Please check that your locale settings: LANGUAGE = (unset), LC_ALL = (unset), LANG = "en_IN.ISO8859-1" are supported and installed on your system.
    perl: warning: Falling back to the standard locale ("C").
    locale: Cannot set LC_CTYPE to default locale: No such file or directory
    locale: Cannot set LC_MESSAGES to default locale: No such file or directory
    locale: Cannot set LC_ALL to default locale: No such file or directory
    perl: warning: Setting locale failed.
    perl: warning: Please check that your locale settings: LANGUAGE = (unset), LC_ALL = (unset), LANG = "en_IN.ISO8859-1" are supported and installed on your system.
    perl: warning: Falling back to the standard locale ("C").
    locale: Cannot set LC_CTYPE to default locale: No such file or directory
    locale: Cannot set LC_MESSAGES to default locale: No such file or directory
    locale: Cannot set LC_ALL to default locale: No such file or directory
    (Reading database ... 205769 files and directories currently installed.)
    Removing crossplatformui ...
    ztemtvcdromd: no process found
    dpkg: error processing crossplatformui (--remove): subprocess installed post-removal script returned error exit status 1
    No apport report written because MaxReports is reached already
    Errors were encountered while processing: crossplatformui
    Setting up firmware-b43-installer (4.150.10.5-5) ...
    --2012-06-01 14:11:21-- http://mirror2.openwrt.org/sources/broadcom-wl-4.150.10.5.tar.bz2
    Resolving mirror2.openwrt.org... 46.4.11.11
    Connecting to mirror2.openwrt.org|46.4.11.11|:80... failed: Connection refused.
    dpkg: error processing firmware-b43-installer (--configure): subprocess installed post-installation script returned error exit status 4

    Read the article

  • ubuntu 12.04 does not detect already installed windows 7

    - by arvind
    Sir, I have already installed Windows 7 Ultimate, and when I tried to install Ubuntu 12.04 its installer didn't detect Windows, even though I have only two primary and two logical partitions under Windows. When I go through "Try Ubuntu" and use the command "os-prober", it shows output like:
    unshare failed: Operation not permitted
    ERROR: you must be root
    ERROR: you must be root
    ERROR: you must be root
    ERROR: you must be root
    ERROR: you must be root
    ERROR: you must be root
    mkdir: cannot create directory `/var/lib/os-prober/mount': Permission denied
    mkdir: cannot create directory `/var/lib/os-prober/mount': Permission denied
    mkdir: cannot create directory `/var/lib/os-prober/mount': Permission denied
    mkdir: cannot create directory `/var/lib/os-prober/mount': Permission denied
    mkdir: cannot create directory `/var/lib/os-prober/mount': Permission denied
    So please help me, what should I do?

    Read the article

  • Unable to access ubuntu through windows boot manager

    - by Arvind
    I have an ASUS K55VM. It came with DOS, and Windows 7 was installed on it. I then planned to add another OS, Ubuntu. My computer had 4 partitions of 240 GB, 230 GB, 230 GB, and 230 GB; I changed this to 5 partitions: 243 GB, 230 GB, 230 GB, 150 GB, and 80 GB. I installed Ubuntu 12.04 on the 80 GB partition through a USB flash drive. It installed fine; however, during boot, when I choose Ubuntu in the Windows Boot Manager, this is what is displayed:

    Windows failed to start. A recent hardware or software change might be the cause. To fix the problem: 1. Insert your Windows installation disc and restart your computer. 2. Choose your language settings, and then click "Next." 3. Click "Repair your computer." If you do not have this disc, contact your system administrator or computer manufacturer for assistance. File: \ubuntu\winboot\wubildr.mbr Status: 0xc000000e Info: The selected entry could not be loaded because the application is missing or corrupt.

    I tried to change the boot order and made Ubuntu my first option. It worked; however, Windows 7 does not show up in GRUB. Out of this fear I did the following: restarted my PC and changed boot option 1 back to Windows, formatted the drive to which Ubuntu was installed (the 80 GB one) through Windows, and restarted again. The problem persists: the Windows Boot Manager still shows Ubuntu, and when I choose it, it gives two lines, something like "grub rescue", but my Windows 7 is working perfectly fine. Now I want a solution to remove Ubuntu from the Windows boot options and reinstall it completely.

    Read the article

  • Graphics driver problem, ATI Radeon HD 3200: small screen size and everything slows down.

    - by Arvind Jangid
    Regards. I am using a 2009 Compaq Presario CQ40-415AU notebook:
    AMD Athlon X2 dual-core processor, 2.1 GHz, 1024 MB L2 cache
    3 GB DDR2 RAM
    ATI Radeon HD 3200 graphics, 256 MB
    14-inch widescreen with a resolution of 1280*800
    I installed Ubuntu 12.04 LTS 32-bit on my laptop. It worked brilliantly until I installed the graphics driver. When I installed the driver, the graphics became slow and everything slowed down. Even the splash screen resolution changed to something like 640*480. I have liked Ubuntu since 9.10 for the freedom it provides and its versatility, but the graphics problem remains the same. I even installed Ubuntu on a 50 GB partition with a 6 GB swap partition. My HDD is 320 GB. Please tell me what is wrong.

    Read the article

  • installed openstack using devstack install shell script but getting 500 error when i try opening dashboard

    - by Arvind
    I followed the instructions at http://devstack.org/guides/single-machine.html to install OpenStack. I first installed Ubuntu on my Windows 7 PC using the officially supported Windows installer for Ubuntu 12.04 LTS. And after that I followed the instructions at above page to install OpenStack. As per instructions, I should be able to access the dashboard aka Horizon, at http://192.168.1.4/ (thats the IP of the PC on which I installed Ubuntu-OpenStack). However I am getting a 500 error web page when I open that. How do I resolve this error? I want to set up a dev environment for OpenStack. For your ref, the whole error message is given now-- FilterError at / /usr/bin/env: node: No such file or directory Request Method: GET Request URL: http://192.168.1.4/ Django Version: 1.4.2 Exception Type: FilterError Exception Value: /usr/bin/env: node: No such file or directory Exception Location: /usr/local/lib/python2.7/dist-packages/compressor/filters/base.py in input, line 133 Python Executable: /usr/bin/python Python Version: 2.7.3 Python Path: ['/opt/stack/horizon/openstack_dashboard/wsgi/../..', '/opt/stack/python-keystoneclient', '/opt/stack/python-novaclient', '/opt/stack/python-openstackclient', '/opt/stack/keystone', '/opt/stack/glance', '/opt/stack/python-glanceclient/setuptools_git-0.4.2-py2.7.egg', '/opt/stack/python-glanceclient', '/opt/stack/nova', '/opt/stack/horizon', '/opt/stack/cinder', '/opt/stack/python-cinderclient', '/usr/local/lib/python2.7/dist-packages', '/usr/lib/python2.7', '/usr/lib/python2.7/plat-linux2', '/usr/lib/python2.7/lib-tk', '/usr/lib/python2.7/lib-old', '/usr/lib/python2.7/lib-dynload', '/usr/lib/python2.7/dist-packages', '/usr/lib/python2.7/dist-packages/PIL', '/usr/lib/python2.7/dist-packages/gst-0.10', '/usr/lib/python2.7/dist-packages/gtk-2.0', '/usr/lib/pymodules/python2.7', '/usr/lib/python2.7/dist-packages/ubuntu-sso-client', '/usr/lib/python2.7/dist-packages/ubuntuone-client', '/usr/lib/python2.7/dist-packages/ubuntuone-control-panel', '/usr/lib/python2.7/dist-packages/ubuntuone-couch', '/usr/lib/python2.7/dist-packages/ubuntuone-storage-protocol', '/opt/stack/horizon/openstack_dashboard'] Server time: Sat, 27 Oct 2012 08:43:29 +0000 Error during template rendering In template /opt/stack/horizon/openstack_dashboard/templates/_stylesheets.html, error at line 3 /usr/bin/env: node: No such file or directory 1 {% load compress %} 2 3 {% compress css %} 4 <link href='{{ STATIC_URL }}dashboard/less/horizon.less' type='text/less' media='screen' rel='stylesheet' /> 5 {% endcompress %} 6 7 <link rel="shortcut icon" href="{{ STATIC_URL }}dashboard/img/favicon.ico"/> 8 Also, the traceback is now given below-- Environment: Request Method: GET Request URL: http://192.168.1.4/ Django Version: 1.4.2 Python Version: 2.7.3 Installed Applications: ('openstack_dashboard', 'django.contrib.contenttypes', 'django.contrib.auth', 'django.contrib.sessions', 'django.contrib.messages', 'django.contrib.staticfiles', 'django.contrib.humanize', 'compressor', 'horizon', 'openstack_dashboard.dashboards.project', 'openstack_dashboard.dashboards.admin', 'openstack_dashboard.dashboards.settings', 'openstack_auth') Installed Middleware: ('django.middleware.common.CommonMiddleware', 'django.middleware.csrf.CsrfViewMiddleware', 'django.contrib.sessions.middleware.SessionMiddleware', 'django.contrib.auth.middleware.AuthenticationMiddleware', 'django.contrib.messages.middleware.MessageMiddleware', 'horizon.middleware.HorizonMiddleware', 'django.middleware.doc.XViewMiddleware', 
'django.middleware.locale.LocaleMiddleware') Template error: In template /opt/stack/horizon/openstack_dashboard/templates/_stylesheets.html, error at line 3 /usr/bin/env: node: No such file or directory 1 : {% load compress %} 2 : 3 : {% compress css %} 4 : <link href='{{ STATIC_URL }}dashboard/less/horizon.less' type='text/less' media='screen' rel='stylesheet' /> 5 : {% endcompress %} 6 : 7 : <link rel="shortcut icon" href="{{ STATIC_URL }}dashboard/img/favicon.ico"/> 8 : Traceback: File "/usr/local/lib/python2.7/dist-packages/django/core/handlers/base.py" in get_response 111. response = callback(request, *callback_args, **callback_kwargs) File "/usr/local/lib/python2.7/dist-packages/django/views/decorators/vary.py" in inner_func 36. response = func(*args, **kwargs) File "/opt/stack/horizon/openstack_dashboard/wsgi/../../openstack_dashboard/views.py" in splash 38. return shortcuts.render(request, 'splash.html', {'form': form}) File "/usr/local/lib/python2.7/dist-packages/django/shortcuts/__init__.py" in render 44. return HttpResponse(loader.render_to_string(*args, **kwargs), File "/usr/local/lib/python2.7/dist-packages/django/template/loader.py" in render_to_string 176. return t.render(context_instance) File "/usr/local/lib/python2.7/dist-packages/django/template/base.py" in render 140. return self._render(context) File "/usr/local/lib/python2.7/dist-packages/django/template/base.py" in _render 134. return self.nodelist.render(context) File "/usr/local/lib/python2.7/dist-packages/django/template/base.py" in render 823. bit = self.render_node(node, context) File "/usr/local/lib/python2.7/dist-packages/django/template/debug.py" in render_node 74. return node.render(context) File "/usr/local/lib/python2.7/dist-packages/django/template/loader_tags.py" in render 155. return self.render_template(self.template, context) File "/usr/local/lib/python2.7/dist-packages/django/template/loader_tags.py" in render_template 137. output = template.render(context) File "/usr/local/lib/python2.7/dist-packages/django/template/base.py" in render 140. return self._render(context) File "/usr/local/lib/python2.7/dist-packages/django/template/base.py" in _render 134. return self.nodelist.render(context) File "/usr/local/lib/python2.7/dist-packages/django/template/base.py" in render 823. bit = self.render_node(node, context) File "/usr/local/lib/python2.7/dist-packages/django/template/debug.py" in render_node 74. return node.render(context) File "/usr/local/lib/python2.7/dist-packages/compressor/templatetags/compress.py" in render 147. return self.render_compressed(context, self.kind, self.mode, forced=forced) File "/usr/local/lib/python2.7/dist-packages/compressor/templatetags/compress.py" in render_compressed 107. rendered_output = self.render_output(compressor, mode, forced=forced) File "/usr/local/lib/python2.7/dist-packages/compressor/templatetags/compress.py" in render_output 119. return compressor.output(mode, forced=forced) File "/usr/local/lib/python2.7/dist-packages/compressor/css.py" in output 51. ret.append(subnode.output(*args, **kwargs)) File "/usr/local/lib/python2.7/dist-packages/compressor/css.py" in output 53. return super(CssCompressor, self).output(*args, **kwargs) File "/usr/local/lib/python2.7/dist-packages/compressor/base.py" in output 230. content = self.filter_input(forced) File "/usr/local/lib/python2.7/dist-packages/compressor/base.py" in filter_input 192. for hunk in self.hunks(forced): File "/usr/local/lib/python2.7/dist-packages/compressor/base.py" in hunks 167. 
precompiled, value = self.precompile(value, **options) File "/usr/local/lib/python2.7/dist-packages/compressor/base.py" in precompile 210. command=command, filename=filename).input(**kwargs) File "/usr/local/lib/python2.7/dist-packages/compressor/filters/base.py" in input 133. raise FilterError(err) Exception Type: FilterError at / Exception Value: /usr/bin/env: node: No such file or directory

    Read the article

  • keyboard key mapping gone haywire

    - by arvind
    I have a Sony VGN-CR353 running Windows 7 Ultimate. For typing I use two keyboards: the inbuilt laptop keyboard and a standard external desktop-grade USB keyboard. Since yesterday, my inbuilt keyboard's keys have gone all awry. This is similar to http://superuser.com/questions/11537/keyboard-keys-not-working-or-resulting-in-the-wrong-key, but the odd part is that the external keyboard is working just fine. I have already tried System Restore, reinstalling the keyboard drivers, etc., but to no avail. This is really bugging me. Please help. Thanks in advance.

    Read the article

  • Extracting DVDs

    - by Arvind Kohli
    I have some movie DVDs; there are 3-4 movies on each DVD, and I want to extract particular movies from the DVDs to save them on another CD. Please let me know how I can do that. I am using Windows XP.

    Read the article

  • webmin bind issue- error when i try to start bind

    - by Arvind
    I am setting up a domain, mydomain.com, with 2 nameservers, ns1.mydomain.com and ns2.mydomain.com, in Webmin - BIND. For this, I have registered 2 child nameservers at my domain registrar, and in Webmin - BIND I have set up a new zone for this domain. In this zone, I have specified 2 nameserver records, one each for ns1 and ns2. Also, I have defined 2 address records, one each for ns1.mydomain.com -> IP address #1 and ns2.mydomain.com -> IP address #2. However, when I try to start BIND in Webmin, I get the following error:
    Failed to start BIND : Starting named:
    Error in named configuration:
    zone mydomain.com/IN: has no NS records
    zone mydomain.com/IN: not loaded due to errors.
    _default/mydomain.com/IN: bad zone
    [FAILED]

    Read the article

  • how to rotate one squid user among multiple IPs based on number of requests processed by each IP

    - by Arvind
    I want to set up a Squid ACL in the following manner. For example, my Squid proxy server has 10 IP addresses, and I have a user 'demouser'. I want the very first request from 'demouser' to go out through IP address #1, the second request through IP address #2, the third request of the day through IP address #3, and so on until all the IPs have been used. One more level of control I would like is that once the user has used each available IP address once, no further proxy requests are allowed through. How do I set up such a configuration with Squid proxy server ACLs? Even a document or how-to would be very helpful. The official wiki talks about one 'weird' case: choosing an IP address based on the time of day the request was made to the proxy server. The other cases are all regular use cases which are not even remotely close to my requirement as specified above.

    Read the article

  • Best way to fetch data from a single database table with multiple threads?

    - by Ravi Bhatt
    Hi, we have a system where we collect data every second on user activity across multiple web sites. We dump that data into a database X (say MS SQL Server). We now need to fetch data from this single table in database X and insert it into database Y (say MySQL). We want to fetch time-based data from database X through multiple threads so that we fetch it as fast as we can. Once it is fetched and stored in database Y, we will delete the data from database X. Are there any best practices for this sort of design? Any specific things to take care of in the table design, like sharding or something? Are there any other things we need to take care of to make sure we fetch the data as fast as we can from threads running on multiple machines? Thanks in advance! Ravi
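    One common pattern for the time-based, multi-threaded fetch is to partition the time range into slices and hand each slice to a worker thread, so every slice can be copied to database Y and later deleted from database X independently. Below is a minimal Java sketch of that idea; the JDBC URL, the 'activity' table, and its columns are illustrative assumptions, and the insert into MySQL is left as a comment.

        import java.sql.*;
        import java.time.Duration;
        import java.time.Instant;
        import java.util.ArrayList;
        import java.util.List;
        import java.util.concurrent.ExecutorService;
        import java.util.concurrent.Executors;
        import java.util.concurrent.Future;

        public final class TimeSlicedFetcher {

            public static void main(String[] args) throws Exception {
                ExecutorService pool = Executors.newFixedThreadPool(4);   // tune to what the DB can sustain
                Instant start = Instant.parse("2024-01-01T00:00:00Z");
                List<Future<Integer>> results = new ArrayList<>();

                // One task per hour-long slice; each slice is fetched, copied, and (later)
                // deleted independently of the others.
                for (int i = 0; i < 24; i++) {
                    Instant from = start.plus(Duration.ofHours(i));
                    Instant to = from.plus(Duration.ofHours(1));
                    results.add(pool.submit(() -> copySlice(from, to)));
                }
                for (Future<Integer> f : results) {
                    System.out.println("rows copied: " + f.get());
                }
                pool.shutdown();
            }

            static int copySlice(Instant from, Instant to) throws SQLException {
                // Each worker opens its own connection; 'activity' and its columns are hypothetical.
                try (Connection src = DriverManager.getConnection("jdbc:sqlserver://host;databaseName=X");
                     PreparedStatement ps = src.prepareStatement(
                             "SELECT id, user_id, event, event_time FROM activity"
                             + " WHERE event_time >= ? AND event_time < ?")) {
                    ps.setTimestamp(1, Timestamp.from(from));
                    ps.setTimestamp(2, Timestamp.from(to));
                    int copied = 0;
                    try (ResultSet rs = ps.executeQuery()) {
                        while (rs.next()) {
                            copied++;   // insert each row into database Y (MySQL) here, ideally in batches
                        }
                    }
                    return copied;
                }
            }
        }

    Only deleting a slice from database X after its batch has committed in database Y keeps the move safe to retry if a worker fails partway through.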

    Read the article

1 2 3  | Next Page >