Surprised no answers with so many votes/views, so I'll bite:
- This will depend on the old POS system; massage the data during import.
- Familiarize yourself with Varien_Io, particularly Varien_Io_File. Since you'll most likely be dealing with such a large collection of data, keep in mind to use the stream methods such as streamReadCsv and streamWriteCsv. Without a stream (linear read/write) you may run into memory issues that the other load/write methods cause.
With the above said here is an example: (source Atwix.com)
/**
 * Generates a CSV file with the product list from the collection in $this->_list
 * @return array
 */
public function generateMlnList()
{
    if (!is_null($this->_list)) {
        $items = $this->_list->getItems();
        if (count($items) > 0) {
            $io   = new Varien_Io_File();
            $path = Mage::getBaseDir('var') . DS . 'export' . DS;
            $name = md5(microtime());
            $file = $path . $name . '.csv'; // $path already ends with DS
            $io->setAllowCreateFolders(true);
            $io->open(array('path' => $path));
            $io->streamOpen($file, 'w+');
            $io->streamLock(true);
            $io->streamWriteCsv($this->_getCsvHeaders($items));
            foreach ($items as $product) {
                $io->streamWriteCsv($product->getData());
            }
            return array(
                'type'  => 'filename',
                'value' => $file,
                'rm'    => true, // the file can be deleted after use
            );
        }
    }
}
As for importing orders, this example has helped most: (Source: pastebin)
<?php
require_once 'app/Mage.php';
Mage::app();

$quote = Mage::getModel('sales/quote')
    ->setStoreId(Mage::app()->getStore('default')->getId());

if ('do customer orders') {
    // for customer orders:
    $customer = Mage::getModel('customer/customer')
        ->setWebsiteId(1)
        ->loadByEmail('customer@example.com');
    $quote->assignCustomer($customer);
} else {
    // for guest orders only:
    $quote->setCustomerEmail('customer@example.com');
}

// add product(s)
$product = Mage::getModel('catalog/product')->load(8);
$buyInfo = array(
    'qty' => 1,
    // custom option id => value id
    // or
    // configurable attribute id => value id
);
$quote->addProduct($product, new Varien_Object($buyInfo));

$addressData = array(
    'firstname'  => 'Test',
    'lastname'   => 'Test',
    'street'     => 'Sample Street 10',
    'city'       => 'Somewhere',
    'postcode'   => '123456',
    'telephone'  => '123456',
    'country_id' => 'US',
    'region_id'  => 12, // id from the directory_country_region table
);
$billingAddress  = $quote->getBillingAddress()->addData($addressData);
$shippingAddress = $quote->getShippingAddress()->addData($addressData);

$shippingAddress->setCollectShippingRates(true)
    ->collectShippingRates()
    ->setShippingMethod('flatrate_flatrate')
    ->setPaymentMethod('checkmo');
$quote->getPayment()->importData(array('method' => 'checkmo'));
$quote->collectTotals()->save();

$service = Mage::getModel('sales/service_quote', $quote);
$service->submitAll();
$order = $service->getOrder();
printf("Created order %s\n", $order->getIncrementId());
The example you have now will be resource-heavy: there are Mage::getModel(...) calls inside foreach loops, which is bad practice and will most likely either time out or fill up memory rather quickly, especially if it is wrapped in another foreach/while.
This...

foreach ($products as $productId => $product) {
    $_product = Mage::getModel('catalog/product')->load($productId);

Should look like:

$_product = Mage::getModel('catalog/product');
foreach ($products as $productId => $product) {
    $_product->load($productId);
I would not attempt to relate every bit of CSV data to Magento objects. That would be madness and a bit of overkill; stick with the resource-model entry points of $model->load(EntityId).
Also note: if you are attempting to import over 100k+ orders, I would be concerned about performance after the large imports, as MySQL must be kept tuned to handle such volumes. Not to mention that, if I'm not mistaken, sales objects are still EAV-based and do not perform well under high volume/traffic. There is a reason Magento Enterprise has a Sales Order Archive module to pull old data out of the "transactional" sales order tables, preventing bloated/stale data that isn't needed for taking orders.
To wrap up: I would raise with the business its requirements and needs for storing such large data; if it is purely for reporting, there are better alternatives than Magento to suit this.
This is more of a hack. Warning: you could accidentally delete items if you're not careful.
The getAllItems method filters items out of the item collection and returns them. To filter items out of the getAllItems array, you simply need to mark the items you want filtered out as deleted:
foreach ($order->getAllItems() as $item) {
    $item->isDeleted(true);
}
Subsequent calls to getAllItems will return only the items not marked as deleted.
Now, where things get hairy: if you call save on such an item, it will actually be deleted. This is dangerous.
So, what I suggest instead, is that you build your own collection with only the data that you know that you want. For instance:
$collection = new Varien_Data_Collection();
And then populate it:
foreach ($order->getAllItems() as $item) {
    // some condition
    $collection->addItem($item);
}
Now your new collection only contains the data that you want, and you can hand that around.
Best Answer
In Mage_Adminhtml_Model_Session_Quote, getOrder() is not a magic method; it is a real method defined in the class. From the logic of that method, it is apparent why you are getting a null'd order object.
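For reference, the method in question in Magento 1's Mage_Adminhtml_Model_Session_Quote reads roughly like this (a sketch from memory; verify against your own codebase):

```php
<?php
// Sketch of Mage_Adminhtml_Model_Session_Quote::getOrder():
// the order is only loaded when an order id is present in session data.
public function getOrder()
{
    if (is_null($this->_order)) {
        $this->_order = Mage::getModel('sales/order');
        if ($this->getOrderId()) {            // magic getter: reads order_id from the session
            $this->_order->load($this->getOrderId());
        }
    }
    return $this->_order;
}
```

Without an order id in the session, the method returns an empty sales/order model, which explains the null'd result.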
What you need to do is simply store the order ID in the session object. When you then call getOrder(), it will populate the order correctly in the result. See the code in the getOrder() method.
Thus use:
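A minimal sketch of that usage ($orderId here is an assumption, standing in for your existing order's entity id):

```php
<?php
// store the order id in the adminhtml quote session first...
$session = Mage::getSingleton('adminhtml/session_quote');
$session->setOrderId($orderId); // magic setter: writes order_id into the session

// ...so that getOrder() can load and return the real order
$order = $session->getOrder();
```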
When things don't work as expected, go have a look at the class that you are using. Not all methods are magic methods in Magento; sometimes a method is a magic method in one class and a real method in another. (This is one of the features that makes Magento code both confusing and powerful to extend.)