Search Engine Indexing and Crawling

When a user enters keywords into a search engine, it returns a list of results that best match those keywords. These results are drawn from a huge collection of pages stored in the search engine's database. Before any search can be answered, that database must first be built, and building it involves two basic sub-processes:

1. Crawling: In the crawling process (also called spidering), the search engine discovers the pages of sites across the web. It does this by downloading pages from a set of sites, typically starting with some seed sites already stored in its database. As it crawls these initial sites, it observes the links they contain. If the search engine discovers new … Read More
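The discovery loop described above can be sketched as a breadth-first crawl: download a page, extract its links, and queue any link not seen before. This is a minimal illustration, not a production crawler; the `fetch` callback is a hypothetical stand-in for an HTTP download, and a real crawler would also honor robots.txt, rate limits, and politeness delays.

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collects the href targets of <a> tags found in an HTML page."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative links against the page's own URL.
                    self.links.append(urljoin(self.base_url, value))

def extract_links(html, base_url):
    parser = LinkExtractor(base_url)
    parser.feed(html)
    return parser.links

def crawl(seed_urls, fetch, max_pages=100):
    """Breadth-first crawl starting from the seed sites.

    `fetch(url)` is a hypothetical callback that returns a page's HTML.
    Every link not seen before is treated as a newly discovered page
    and queued for download, exactly as the crawling step describes.
    """
    seen = set(seed_urls)
    queue = deque(seed_urls)
    pages = {}  # url -> html, the raw material for the index
    while queue and len(pages) < max_pages:
        url = queue.popleft()
        html = fetch(url)
        pages[url] = html
        for link in extract_links(html, url):
            if link not in seen:  # a page the engine has just discovered
                seen.add(link)
                queue.append(link)
    return pages
```

In practice the downloaded pages would then feed the second sub-process, indexing, which builds the searchable database.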

Moving Microsoft Dynamics GP From One SQL Server to Another

If your company deploys Microsoft Great Plains as its corporate ERP, some day you will face the IT routine of migrating your accounting/MRP system from one physical computer to another. As an IT person you may or may not have deep GP architecture knowledge, which is why we would like to orient you on this subject of transferring. We would like to stress that this transfer routine is typically something that should be done by a professional consultant, so please do not hold us responsible for the consequences if you run into complications: customizations, modified reports, and complex integrations, to name a few.

o Great Plains Architecture. Microsoft Dynamics GP is a Microsoft SQL Server based application. The GP workstation connects to the SQL database via ODBC, which means that when the migration is done, you must update the ODBC connection settings to point to the new SQL Server.
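The ODBC change amounts to rewriting the SERVER attribute of the workstation's connection string while leaving the driver, database, and authentication attributes untouched. The sketch below illustrates just that string change; the server names are hypothetical, and in practice you would make the edit through the Windows ODBC Data Source Administrator on each GP workstation rather than in code.

```python
def repoint_odbc(conn_str, new_server):
    """Rewrite the SERVER attribute of an ODBC connection string so it
    points at the new SQL Server host; all other attributes are kept."""
    parts = []
    for attr in conn_str.split(";"):
        if not attr:
            continue
        key, _, value = attr.partition("=")
        if key.strip().upper() == "SERVER":
            value = new_server  # the only attribute migration changes
        parts.append(f"{key.strip()}={value}")
    return ";".join(parts) + ";"

# Hypothetical example: move a GP DSN from OLDSQL01 to NEWSQL01.
old = "DRIVER={SQL Server};SERVER=OLDSQL01;DATABASE=DYNAMICS;Trusted_Connection=Yes;"
new = repoint_odbc(old, "NEWSQL01")
```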

o GP Security … Read More
