Merge lp:~openerp-dev/openobject-server/trunk-for-caribou-jam into lp:openobject-server
Status: Work in progress
Proposed branch: lp:~openerp-dev/openobject-server/trunk-for-caribou-jam
Merge into: lp:openobject-server
Diff against target: 392 lines (+149/-51), 9 files modified
  openerp-server (+2/-0)
  openerp/addons/base/ir/workflow/print_instance.py (+63/-37)
  openerp/modules/loading.py (+0/-2)
  openerp/osv/orm.py (+2/-2)
  openerp/tests/test_ir_sequence.py (+3/-0)
  openerp/tools/convert.py (+5/-2)
  openerp/workflow/instance.py (+10/-1)
  openerp/workflow/wkf_service.py (+13/-2)
  openerp/workflow/workitem.py (+51/-5)
To merge this branch: bzr merge lp:~openerp-dev/openobject-server/trunk-for-caribou-jam
Related bugs:
Reviewer: OpenERP Core Team (review pending)
Review via email: mp+118926@code.launchpad.net
Commit message
Description of the change
- 4246. By Jigar A.: [MERGE] lp:openobject-server
- 4247. By Jigar A.: [MERGE] lp:openobject-server
- 4248. By Jigar A.: [MERGE] lp:openobject-server
- 4249. By Jigar A.: [MERGE] lp:~openerp-dev/openobject-server/trunk-for-caribou-demo-tpa
- 4250. By Jigar A.: merge lp:openobject-server
- 4251. By Jigar A.: merge lp:openobject-server
- 4252. By Jigar A.: [merge] lp:openobject-server
- 4253. By Jigar A.: [FIX] Workflow node fix, patch by Vo Minh Thu
- 4254. By Jigar A.: merge lp:openobject-server
- 4255. By Jigar A.: merge lp:openobject-server
- 4256. By Jigar A.: [FIX] removed conflicts
- 4257. By Vo Minh Thu: [IMP] workflow: only use already loaded transitions during an update.
- 4258. By Jigar A.: Merge lp:openobject-server
- 4259. By Vo Minh Thu: [MERGE] merged trunk.
- 4260. By Vo Minh Thu: [IMP] loading: only CSV files are in a forced noupdate; other data rely on the noupdate attribute in the XML file.
- 4261. By Jigar A.: [MERGE] Merged lp:openobject-server
- 4262. By Jigar A.: [MERGE] Merged lp:openobject-server
- 4263. By Vo Minh Thu: [FIX] workflow: during an update, consider only workflows from already loaded modules.
- 4264. By Vo Minh Thu: [MERGE] merged trunk.
- 4265. By Jigar A.: [MERGE] lp:openobject-server
- 4266. By Jigar A.: [MERGE] lp:openobject-server
- 4267. By Jigar A.: [MERGE] lp:openobject-server
- 4268. By Vo Minh Thu: [IMP] workflow: consider only transitions and activities from loaded modules (useful during an update).
- 4269. By Vo Minh Thu: [FIX] workflows: similarly-named columns in the results of a `select *` with a join are overwritten, so use explicit column names.
- 4270. By Vo Minh Thu: [MERGE] merged trunk.
- 4271. By Vo Minh Thu: [FIX] workflow: consider only transitions from already loaded modules.
- 4272. By Vo Minh Thu: [MERGE] merged trunk.
- 4273. By Jigar A.: [MERGE] lp:openobject-server
- 4274. By Jigar A.: [MERGE] Sync with Trunk
- 4275. By Jigar A.: [MERGE] Sync with Trunk
- 4276. By Olivier Laurent (Open ERP): [MERGE] merge with trunk
- 4277. By Olivier Laurent (Open ERP): [FIX] re-raise exception in case of error
Vo Minh Thu (thu) wrote:
There is a leftover pydb import/call.
- 4278. By Vo Minh Thu: [REV] reverted leftover pydb call.
- 4279. By Vo Minh Thu: [MERGE] merged trunk.
Unmerged revisions
- 4279. By Vo Minh Thu: [MERGE] merged trunk.
- 4278. By Vo Minh Thu: [REV] reverted leftover pydb call.
- 4277. By Olivier Laurent (Open ERP): [FIX] re-raise exception in case of error
- 4276. By Olivier Laurent (Open ERP): [MERGE] merge with trunk
- 4275. By Jigar A.: [MERGE] Sync with Trunk
- 4274. By Jigar A.: [MERGE] Sync with Trunk
- 4273. By Jigar A.: [MERGE] lp:openobject-server
- 4272. By Vo Minh Thu: [MERGE] merged trunk.
- 4271. By Vo Minh Thu: [FIX] workflow: consider only transitions from already loaded modules.
- 4270. By Vo Minh Thu: [MERGE] merged trunk.
Preview Diff
1 | === modified file 'openerp-server' |
2 | --- openerp-server 2012-10-29 17:33:21 +0000 |
3 | +++ openerp-server 2012-11-09 14:23:22 +0000 |
4 | @@ -97,6 +97,7 @@ |
5 | registry.schedule_cron_jobs() |
6 | except Exception: |
7 | _logger.exception('Failed to initialize database `%s`.', dbname) |
8 | + raise |
9 | |
10 | def run_test_file(dbname, test_file): |
11 | """ Preload a registry, possibly run a test file, and start the cron.""" |
12 | @@ -110,6 +111,7 @@ |
13 | cr.close() |
14 | except Exception: |
15 | _logger.exception('Failed to initialize database `%s` and run test file `%s`.', dbname, test_file) |
16 | + raise |
17 | |
18 | def export_translation(): |
19 | config = openerp.tools.config |
20 | |
21 | === modified file 'openerp/addons/base/ir/workflow/print_instance.py' |
22 | --- openerp/addons/base/ir/workflow/print_instance.py 2012-02-02 12:54:42 +0000 |
23 | +++ openerp/addons/base/ir/workflow/print_instance.py 2012-11-09 14:23:22 +0000 |
24 | @@ -123,12 +123,35 @@ |
25 | processed_subflows = set() |
26 | graph_get(cr, graph, [x[0] for x in inst], nested, workitem_get(inst_id), processed_subflows) |
27 | |
28 | -# |
29 | -# TODO: pas clean: concurrent !!! |
30 | -# |
31 | +def create_dot_graph(cr, model_name, workflow_name, nested, inst_ids): |
32 | + import pydot |
33 | + graph = pydot.Dot( |
34 | + graph_name=model_name.replace('.','_'), |
35 | + fontsize='16', |
36 | + label="""\\n\\nWorkflow: %s\\n Model: %s""" % (workflow_name, model_name), |
37 | + size='7.3, 10.1', center='1', ratio='auto', rotate='0', rankdir='TB', |
38 | + ) |
39 | + for inst_id in inst_ids: |
40 | + graph_instance_get(cr, graph, inst_id[0], nested) |
41 | + return graph |
42 | + |
43 | +def write_dot_graph_svg(graph, filename): |
44 | + graph.write(filename, prog='dot', format='svg') |
45 | + |
46 | +def write_workflow_svg(cr, model_name, res_id, filename): |
47 | + cr.execute('select name from wkf where osv=%s limit 1', (model_name,)) |
48 | + workflow = cr.dictfetchone() |
49 | + cr.execute('select i.id from wkf_instance i left join wkf w on (i.wkf_id=w.id) where res_id=%s and osv=%s', (res_id, model_name)) |
50 | + instances = cr.fetchall() |
51 | + graph = create_dot_graph(cr, model_name, workflow['name'] if workflow else 'NO WORKFLOW', True, instances) |
52 | + write_dot_graph_svg(graph, filename) |
53 | |
54 | class report_graph_instance(object): |
55 | def __init__(self, cr, uid, ids, data): |
56 | + model_name = data['model'] |
57 | + res_id = data['id'] |
58 | + nested = data.get('nested', False) |
59 | + |
60 | try: |
61 | import pydot |
62 | except Exception,e: |
63 | @@ -136,48 +159,29 @@ |
64 | 'Import Error for pydot, you will not be able to render workflows.\n' |
65 | 'Consider Installing PyDot or dependencies: http://dkbza.org/pydot.html.') |
66 | raise e |
67 | + |
68 | + # TODO: make it thread-safe. |
69 | self.done = False |
70 | |
71 | + # Create a Dot graph and render it as a PS string. |
72 | try: |
73 | - cr.execute('select * from wkf where osv=%s limit 1', |
74 | - (data['model'],)) |
75 | + cr.execute('select name from wkf where osv=%s limit 1', (model_name,)) |
76 | wkfinfo = cr.dictfetchone() |
77 | + cr.execute('select i.id from wkf_instance i left join wkf w on (i.wkf_id=w.id) where res_id=%s and osv=%s', (res_id, model_name)) |
78 | + inst_ids = cr.fetchall() |
79 | if not wkfinfo: |
80 | - ps_string = '''%PS-Adobe-3.0 |
81 | -/inch {72 mul} def |
82 | -/Times-Roman findfont 50 scalefont setfont |
83 | -1.5 inch 15 inch moveto |
84 | -(No workflow defined) show |
85 | -showpage''' |
86 | + ps_string = NO_DEFINED_WORKFLOW |
87 | + elif not inst_ids: |
88 | + ps_string = NO_DEFINED_INSTANCE |
89 | else: |
90 | - cr.execute('select i.id from wkf_instance i left join wkf w on (i.wkf_id=w.id) where res_id=%s and osv=%s',(data['id'],data['model'])) |
91 | - inst_ids = cr.fetchall() |
92 | - if not inst_ids: |
93 | - ps_string = '''%PS-Adobe-3.0 |
94 | -/inch {72 mul} def |
95 | -/Times-Roman findfont 50 scalefont setfont |
96 | -1.5 inch 15 inch moveto |
97 | -(No workflow instance defined) show |
98 | -showpage''' |
99 | - else: |
100 | - graph = pydot.Dot(graph_name=data['model'].replace('.','_'), |
101 | - fontsize='16', |
102 | - label="""\\\n\\nWorkflow: %s\\n OSV: %s""" % (wkfinfo['name'],wkfinfo['osv']), |
103 | - size='7.3, 10.1', center='1', ratio='auto', rotate='0', rankdir='TB', |
104 | - ) |
105 | - for inst_id in inst_ids: |
106 | - inst_id = inst_id[0] |
107 | - graph_instance_get(cr, graph, inst_id, data.get('nested', False)) |
108 | - ps_string = graph.create(prog='dot', format='ps') |
109 | + graph = create_dot_graph(cr, model_name, wkfinfo['name'], nested, inst_ids) |
110 | + ps_string = graph.create(prog='dot', format='ps') |
111 | except Exception, e: |
112 | _logger.exception('Exception in call:') |
113 | - # string is in PS, like the success message would have been |
114 | - ps_string = '''%PS-Adobe-3.0 |
115 | -/inch {72 mul} def |
116 | -/Times-Roman findfont 50 scalefont setfont |
117 | -1.5 inch 15 inch moveto |
118 | -(No workflow available) show |
119 | -showpage''' |
120 | + # Don't raise the exception but set an error message in the PS string. |
121 | + ps_string = NO_WORKFLOW |
122 | + |
123 | + # Convert the PS string to a PDF file. |
124 | if os.name == "nt": |
125 | prog = 'ps2pdf.bat' |
126 | else: |
127 | @@ -188,6 +192,7 @@ |
128 | input.close() |
129 | self.result = output.read() |
130 | output.close() |
131 | + |
132 | self.done = True |
133 | |
134 | def is_done(self): |
135 | @@ -216,5 +221,26 @@ |
136 | |
137 | report_graph('report.workflow.instance.graph', 'ir.workflow') |
138 | |
139 | +NO_DEFINED_WORKFLOW='''%PS-Adobe-3.0 |
140 | +/inch {72 mul} def |
141 | +/Times-Roman findfont 50 scalefont setfont |
142 | +1.5 inch 15 inch moveto |
143 | +(No workflow defined) show |
144 | +showpage''' |
145 | + |
146 | +NO_DEFINED_INSTANCE='''%PS-Adobe-3.0 |
147 | +/inch {72 mul} def |
148 | +/Times-Roman findfont 50 scalefont setfont |
149 | +1.5 inch 15 inch moveto |
150 | +(No workflow instance defined) show |
151 | +showpage''' |
152 | + |
153 | +NO_WORKFLOW='''%PS-Adobe-3.0 |
154 | +/inch {72 mul} def |
155 | +/Times-Roman findfont 50 scalefont setfont |
156 | +1.5 inch 15 inch moveto |
157 | +(No workflow available) show |
158 | +showpage''' |
159 | + |
160 | # vim:expandtab:smartindent:tabstop=4:softtabstop=4:shiftwidth=4: |
161 | |
162 | |
163 | === modified file 'openerp/modules/loading.py' |
164 | --- openerp/modules/loading.py 2012-11-02 13:47:32 +0000 |
165 | +++ openerp/modules/loading.py 2012-11-09 14:23:22 +0000 |
166 | @@ -109,8 +109,6 @@ |
167 | pathname = os.path.join(module_name, filename) |
168 | fp = tools.file_open(pathname) |
169 | noupdate = False |
170 | - if kind in ('demo', 'demo_xml'): |
171 | - noupdate = True |
172 | try: |
173 | if ext == '.csv': |
174 | if kind in ('init', 'init_xml'): |
175 | |
176 | === modified file 'openerp/osv/orm.py' |
177 | --- openerp/osv/orm.py 2012-11-09 11:39:51 +0000 |
178 | +++ openerp/osv/orm.py 2012-11-09 14:23:22 +0000 |
179 | @@ -4147,7 +4147,7 @@ |
180 | # fields that are computer may refer (directly or indirectly) to |
181 | # parent_left/right (via a child_of domain) |
182 | if parents_changed: |
183 | - if self.pool._init: |
184 | + if self.pool._init and False: |
185 | self.pool._init_parent[self._name] = True |
186 | else: |
187 | order = self._parent_order or self._order |
188 | @@ -4373,7 +4373,7 @@ |
189 | upd_todo.sort(lambda x, y: self._columns[x].priority-self._columns[y].priority) |
190 | |
191 | if self._parent_store and not context.get('defer_parent_store_computation'): |
192 | - if self.pool._init: |
193 | + if self.pool._init and False: |
194 | self.pool._init_parent[self._name] = True |
195 | else: |
196 | parent = vals.get(self._parent_name, False) |
197 | |
198 | === modified file 'openerp/tests/test_ir_sequence.py' |
199 | --- openerp/tests/test_ir_sequence.py 2012-11-07 11:10:11 +0000 |
200 | +++ openerp/tests/test_ir_sequence.py 2012-11-09 14:23:22 +0000 |
201 | @@ -161,6 +161,9 @@ |
202 | ids = registry('ir.sequence').search(cr, ADMIN_USER_ID, |
203 | [('code', 'in', ['test_sequence_type_3', 'test_sequence_type_4'])], {}) |
204 | registry('ir.sequence').unlink(cr, ADMIN_USER_ID, ids, {}) |
205 | + ids = registry('ir.sequence.type').search(cr, ADMIN_USER_ID, |
206 | + [('code', 'in', ['test_sequence_type_3', 'test_sequence_type_4'])], {}) |
207 | + registry('ir.sequence.type').unlink(cr, ADMIN_USER_ID, ids, {}) |
208 | cr.commit() |
209 | cr.close() |
210 | |
211 | |
212 | === modified file 'openerp/tools/convert.py' |
213 | --- openerp/tools/convert.py 2012-10-01 14:49:41 +0000 |
214 | +++ openerp/tools/convert.py 2012-11-09 14:23:22 +0000 |
215 | @@ -508,10 +508,11 @@ |
216 | self.pool.get('ir.model.data').ir_set(cr, self.uid, res['key'], res['key2'], res['name'], res['models'], res['value'], replace=res.get('replace',True), isobject=res.get('isobject', False), meta=res.get('meta',None)) |
217 | |
218 | def _tag_workflow(self, cr, rec, data_node=None): |
219 | - if self.isnoupdate(data_node) and self.mode != 'init': |
220 | + w_ref = rec.get('ref','') |
221 | + if self.isnoupdate(data_node) and self.mode != 'init' and w_ref not in self.just_created: |
222 | return |
223 | model = str(rec.get('model','')) |
224 | - w_ref = rec.get('ref','') |
225 | + |
226 | if w_ref: |
227 | id = self.id_get(cr, w_ref) |
228 | else: |
229 | @@ -814,6 +815,7 @@ |
230 | |
231 | id = self.pool.get('ir.model.data')._update(cr, self.uid, rec_model, self.module, res, rec_id or False, not self.isnoupdate(data_node), noupdate=self.isnoupdate(data_node), mode=self.mode, context=rec_context ) |
232 | if rec_id: |
233 | + self.just_created.add(rec_id) |
234 | self.idref[rec_id] = int(id) |
235 | if config.get('import_partial', False): |
236 | cr.commit() |
237 | @@ -867,6 +869,7 @@ |
238 | report = assertion_report.assertion_report() |
239 | self.assertion_report = report |
240 | self.noupdate = noupdate |
241 | + self.just_created = set() |
242 | self._tags = { |
243 | 'menuitem': self._tag_menuitem, |
244 | 'record': self._tag_record, |
245 | |
246 | === modified file 'openerp/workflow/instance.py' |
247 | --- openerp/workflow/instance.py 2011-02-07 12:57:23 +0000 |
248 | +++ openerp/workflow/instance.py 2012-11-09 14:23:22 +0000 |
249 | @@ -29,7 +29,16 @@ |
250 | (uid,res_type,res_id) = ident |
251 | cr.execute('insert into wkf_instance (res_type,res_id,uid,wkf_id) values (%s,%s,%s,%s) RETURNING id', (res_type,res_id,uid,wkf_id)) |
252 | id_new = cr.fetchone()[0] |
253 | - cr.execute('select * from wkf_activity where flow_start=True and wkf_id=%s', (wkf_id,)) |
254 | + import openerp |
255 | + pool = openerp.modules.registry.RegistryManager.registries[cr.dbname] |
256 | + if pool._init: |
257 | + # Module init currently in progress, only consider activities from modules whose code was already loaded |
258 | + cr.execute("""select wkf_activity.id from wkf_activity left join ir_model_data md on |
259 | + (md.model = 'workflow.activity' and md.res_id = wkf_activity.id) |
260 | + where flow_start=True and wkf_id=%s and md.module in %s""", |
261 | + (wkf_id, tuple(pool._init_modules))) |
262 | + else: |
263 | + cr.execute('select id from wkf_activity where flow_start=True and wkf_id=%s', (wkf_id,)) |
264 | res = cr.dictfetchall() |
265 | stack = [] |
266 | workitem.create(cr, res, id_new, ident, stack=stack) |
267 | |
268 | === modified file 'openerp/workflow/wkf_service.py' |
269 | --- openerp/workflow/wkf_service.py 2011-09-24 14:52:58 +0000 |
270 | +++ openerp/workflow/wkf_service.py 2012-11-09 14:23:22 +0000 |
271 | @@ -101,7 +101,16 @@ |
272 | if res_type in self.wkf_on_create_cache[cr.dbname]: |
273 | wkf_ids = self.wkf_on_create_cache[cr.dbname][res_type] |
274 | else: |
275 | - cr.execute('select id from wkf where osv=%s and on_create=True', (res_type,)) |
276 | + import openerp |
277 | + pool = openerp.modules.registry.RegistryManager.registries[cr.dbname] |
278 | + if pool._init and pool._init_modules: |
279 | + # Module init currently in progress, only consider transitions from modules whose code was already loaded |
280 | + cr.execute("""select wkf.id from wkf left join ir_model_data md on |
281 | + (md.model = 'workflow' and md.res_id = wkf.id) |
282 | + where osv=%s and on_create=True and md.module in %s""", |
283 | + (res_type, tuple(pool._init_modules))) |
284 | + else: |
285 | + cr.execute('select id from wkf where osv=%s and on_create=True', (res_type,)) |
286 | wkf_ids = cr.fetchall() |
287 | self.wkf_on_create_cache[cr.dbname][res_type] = wkf_ids |
288 | for (wkf_id,) in wkf_ids: |
289 | @@ -120,7 +129,9 @@ |
290 | ident = (uid,res_type,res_id) |
291 | # ids of all active workflow instances for a corresponding resource (id, model_nam) |
292 | cr.execute('select id from wkf_instance where res_id=%s and res_type=%s and state=%s', (res_id, res_type, 'active')) |
293 | - for (id,) in cr.fetchall(): |
294 | + res = cr.fetchall() |
295 | + assert len(res) <= 1 # TODO make (res_type, res_id) unique? |
296 | + for (id,) in res: |
297 | res2 = instance.validate(cr, id, ident, signal) |
298 | result = result or res2 |
299 | return result |
300 | |
301 | === modified file 'openerp/workflow/workitem.py' |
302 | --- openerp/workflow/workitem.py 2011-12-11 10:21:40 +0000 |
303 | +++ openerp/workflow/workitem.py 2012-11-09 14:23:22 +0000 |
304 | @@ -44,7 +44,16 @@ |
305 | if stack is None: |
306 | raise 'Error !!!' |
307 | result = True |
308 | - cr.execute('select * from wkf_activity where id=%s', (workitem['act_id'],)) |
309 | + import openerp |
310 | + pool = openerp.modules.registry.RegistryManager.registries[cr.dbname] |
311 | + if pool._init: |
312 | + # Module init currently in progress, only consider activities from modules whose code was already loaded |
313 | + cr.execute("""select wkf_activity.id, split_mode, signal_send, kind, action, action_id, subflow_id from wkf_activity left join ir_model_data md on |
314 | + (md.model = 'workflow.activity' and md.res_id = wkf_activity.id) |
315 | + where wkf_activity.id=%s and md.module in %s""", |
316 | + (workitem['act_id'], tuple(pool._init_modules))) |
317 | + else: |
318 | + cr.execute('select id, split_mode, signal_send, kind, action, action_id, subflow_id from wkf_activity where id=%s', (workitem['act_id'],)) |
319 | activity = cr.dictfetchone() |
320 | |
321 | triggers = False |
322 | @@ -62,7 +71,14 @@ |
323 | triggers = triggers and not ok |
324 | |
325 | if triggers: |
326 | - cr.execute('select * from wkf_transition where act_from=%s', (workitem['act_id'],)) |
327 | + if pool._init: |
328 | + # Module init currently in progress, only consider transitions from modules whose code was already loaded |
329 | + cr.execute("""select wkf_transition.id, trigger_model, trigger_expr_id from wkf_transition left join ir_model_data md on |
330 | + (md.model = 'workflow.transition' and md.res_id = wkf_transition.id) |
331 | + where act_from=%s and md.module in %s""", |
332 | + (workitem['act_id'], tuple(pool._init_modules))) |
333 | + else: |
334 | + cr.execute('select id, trigger_model, trigger_expr_id from wkf_transition where act_from=%s', (workitem['act_id'],)) |
335 | alltrans = cr.dictfetchall() |
336 | for trans in alltrans: |
337 | if trans['trigger_model']: |
338 | @@ -149,7 +165,16 @@ |
339 | def _split_test(cr, workitem, split_mode, ident, signal=None, stack=None): |
340 | if stack is None: |
341 | raise 'Error !!!' |
342 | - cr.execute('select * from wkf_transition where act_from=%s', (workitem['act_id'],)) |
343 | + import openerp |
344 | + pool = openerp.modules.registry.RegistryManager.registries[cr.dbname] |
345 | + if pool._init: |
346 | + # Module init currently in progress, only consider transitions from modules whose code was already loaded |
347 | + cr.execute("""select wkf_transition.id, signal, group_id, condition from wkf_transition left join ir_model_data md on |
348 | + (md.model = 'workflow.transition' and md.res_id = wkf_transition.id) |
349 | + where act_from=%s and md.module in %s""", |
350 | + (workitem['act_id'], tuple(pool._init_modules))) |
351 | + else: |
352 | + cr.execute('select id, signal, group_id, condition from wkf_transition where act_from=%s', (workitem['act_id'],)) |
353 | test = False |
354 | transitions = [] |
355 | alltrans = cr.dictfetchall() |
356 | @@ -178,13 +203,34 @@ |
357 | return False |
358 | |
359 | def _join_test(cr, trans_id, inst_id, ident, stack): |
360 | - cr.execute('select * from wkf_activity where id=(select act_to from wkf_transition where id=%s)', (trans_id,)) |
361 | + import openerp |
362 | + pool = openerp.modules.registry.RegistryManager.registries[cr.dbname] |
363 | + if pool._init: |
364 | + # Module init currently in progress, only consider activities from modules whose code was already loaded |
365 | + cr.execute("""select wkf_activity.id, join_mode from wkf_activity left join ir_model_data md on |
366 | + (md.model = 'workflow.activity' and md.res_id = wkf_activity.id) |
367 | + where wkf_activity.id= |
368 | + (select act_to from wkf_transition left join ir_model_data md on |
369 | + (md.model = 'workflow.transition' and md.res_id = wkf_transition.id) |
370 | + where wkf_transition.id=%s and md.module in %s |
371 | + ) |
372 | + and md.module in %s""", |
373 | + (trans_id, tuple(pool._init_modules), tuple(pool._init_modules))) |
374 | + else: |
375 | + cr.execute('select id, join_mode from wkf_activity where id=(select act_to from wkf_transition where id=%s)', (trans_id,)) |
376 | activity = cr.dictfetchone() |
377 | if activity['join_mode']=='XOR': |
378 | create(cr,[activity], inst_id, ident, stack) |
379 | cr.execute('delete from wkf_witm_trans where inst_id=%s and trans_id=%s', (inst_id,trans_id)) |
380 | else: |
381 | - cr.execute('select id from wkf_transition where act_to=%s', (activity['id'],)) |
382 | + if pool._init: |
383 | + # Module init currently in progress, only consider transitions from modules whose code was already loaded |
384 | + cr.execute("""select wkf_transition.id from wkf_transition left join ir_model_data md on |
385 | + (md.model = 'workflow.transition' and md.res_id = wkf_transition.id) |
386 | + where act_to=%s and md.module in %s""", |
387 | + (activity['id'], tuple(pool._init_modules))) |
388 | + else: |
389 | + cr.execute('select id from wkf_transition where act_to=%s', (activity['id'],)) |
390 | trans_ids = cr.fetchall() |
391 | ok = True |
392 | for (id,) in trans_ids: |
This branch makes a few changes to the server so that it is possible (1) to install the demo data and (2) to run the tests after a single update (i.e. with the -u command-line flag) and have them pass.
This is intended as a rough way to test a migrated database (which would normally not contain the demo data needed to run the tests properly). The only trick is to set demo=True in the ir_module_module table before proceeding with the update.
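A minimal sketch of that trick, assuming the table and column names from the description above; the helper name and the optional module filter are invented for illustration:

```python
# Hypothetical helper: build the SQL that flags modules as demo-enabled in
# ir_module_module before running the update with -u. Returns a
# (query, params) pair suitable for cr.execute().
def enable_demo_sql(module_names=None):
    if module_names:
        return ("update ir_module_module set demo=True where name in %s",
                (tuple(module_names),))
    # No filter: flag every module as demo-enabled.
    return ("update ir_module_module set demo=True", ())
```

The returned pair would be executed on an open cursor before launching the server with the -u flag.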
So let's go through this.
- Changes in openerp/addons/base/ir/workflow/print_instance.py don't matter much: this is just some minimal refactoring, and it exposes the ability to dump workflows as SVG files on disk.
- The `if self.pool._init and False:` in orm.py should clearly be reverted. It means that computing the parent left/right values can no longer be deferred until after the tests (i.e. some tests need that information to be available).
- Changes in openerp/tests/test_ir_sequence.py make sure the ir_sequence tests can be run repeatedly (i.e. one run doesn't affect the next).
- Changes in openerp/tools/convert.py let us know which records are fresh (i.e. just created while reading the imported file): if a record is new, its workflow is allowed to run. This may no longer be necessary given the new noupdate semantics (see the next bullet point).
- Changes in openerp/modules/loading.py: noupdate is now forced only for CSV files; otherwise the noupdate attribute is honored (previously, demo data were always loaded in noupdate mode).
- The openerp/workflow sub-module is changed to ignore activities and transitions that are present in the database but defined by modules not yet loaded (this is necessary to let workflows run during tests while in the middle of updating the module graph).
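The print_instance.py refactoring mentioned above exposes a write_workflow_svg helper. A database-free sketch of its control flow (per the diff), with the cursor lookups and the pydot rendering replaced by stub callables, might look like:

```python
# Mirrors the control flow of the new write_workflow_svg helper: look up
# the workflow name, collect the instance ids, then render the graph.
# The three callables stand in for the SQL queries and SVG rendering.
def write_workflow_svg_sketch(fetch_workflow, fetch_instances, render, filename):
    workflow = fetch_workflow()        # e.g. a row {'name': ...}, or None
    instances = fetch_instances()      # e.g. [(1,), (2,)]
    name = workflow['name'] if workflow else 'NO WORKFLOW'
    return render(name, instances, filename)
```

In the real helper, `render` builds a pydot.Dot graph and writes it with `prog='dot', format='svg'`.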
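The convert.py change can be illustrated with a small stand-alone guard; the class and method names are invented, but the skip condition matches the diff:

```python
# Stand-alone sketch of the convert.py logic: remember the ids of records
# created during this import pass, and skip a <workflow> tag only when the
# data is noupdate, we are not in init mode, AND the record is not fresh.
class WorkflowGuard:
    def __init__(self, mode):
        self.mode = mode           # 'init' or 'update'
        self.just_created = set()  # ids created while reading this file

    def record_created(self, rec_id):
        self.just_created.add(rec_id)

    def should_run_workflow(self, noupdate, w_ref):
        if noupdate and self.mode != 'init' and w_ref not in self.just_created:
            return False
        return True
```

A record created during the current pass thus gets its workflow triggered even in noupdate/update mode.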
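The loading.py rule can be sketched as a tiny pure function (the function name is made up; the rule itself is taken from the description and the diff):

```python
import os

# Sketch of the new loading.py rule: CSV data files are always loaded in
# noupdate mode; other data files keep whatever their own noupdate
# attribute says. Previously, demo data were forced to noupdate as well.
def effective_noupdate(filename, xml_noupdate_attr=False):
    ext = os.path.splitext(filename)[1]
    if ext == '.csv':
        return True
    return xml_noupdate_attr
```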
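The filtering pattern of the last bullet, repeated across instance.py, wkf_service.py and workitem.py in the diff, can be sketched as a query builder; the function name is invented, and the SQL follows the wkf_activity case from the diff:

```python
# Sketch of the workflow query pattern: during a module update
# (pool._init), join ir_model_data so that only activities owned by
# already-loaded modules (pool._init_modules) are considered; otherwise
# query wkf_activity directly. Returns a (query, params) pair.
def start_activity_query(wkf_id, init_modules=None):
    if init_modules:
        sql = ("select wkf_activity.id from wkf_activity "
               "left join ir_model_data md on "
               "(md.model = 'workflow.activity' and md.res_id = wkf_activity.id) "
               "where flow_start=True and wkf_id=%s and md.module in %s")
        return sql, (wkf_id, tuple(init_modules))
    sql = 'select id from wkf_activity where flow_start=True and wkf_id=%s'
    return sql, (wkf_id,)
```

Note the module list is passed as a tuple so the database driver can adapt it for the `in %s` clause.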